Shirin Dora

ORCID: 0000-0001-6182-4124
Research Areas
  • Neural dynamics and brain function
  • Advanced Memory and Neural Computing
  • Neural Networks and Applications
  • Neural Networks and Reservoir Computing
  • Ferroelectric and Negative Capacitance Devices
  • EEG and Brain-Computer Interfaces
  • Visual perception and processing mechanisms
  • Effects of Environmental Stressors on Livestock
  • Generative Adversarial Networks and Image Synthesis
  • Anomaly Detection Techniques and Applications
  • Neuroscience and Neural Engineering
  • Industrial Vision Systems and Defect Detection
  • Blind Source Separation Techniques
  • Non-Destructive Testing Techniques
  • Tactile and Sensory Interactions
  • Face Recognition and Perception
  • Dementia and Cognitive Impairment Research
  • Reinforcement Learning in Robotics
  • Age of Information Optimization
  • Neurobiology and Insect Physiology Research
  • Domain Adaptation and Few-Shot Learning
  • Olfactory and Sensory Function Studies
  • Visual Attention and Saliency Detection
  • Aesthetic Perception and Analysis
  • Machine Learning and ELM

Loughborough University
2021-2024

University of Amsterdam
2018-2024

University of Ulster
2019-2023

Netherlands Institute for Neuroscience
2019-2023

Intelligent Systems Research (United States)
2020

Nanyang Technological University
2015-2018

Amsterdam University of the Arts
2018

Intel (United States)
2014

Deep neural networks with rate-based neurons have exhibited tremendous progress in the last decade. However, the same level of progress has not been observed in research on spiking neural networks (SNNs), despite their capability to handle temporal data, their energy efficiency, and their low latency. This could be because benchmarking techniques for SNNs are based on methods used for evaluating deep networks, which do not provide a clear evaluation of the capabilities of SNNs. In particular, benchmarking SNN approaches with regard to energy efficiency and latency requires...

10.3390/bdcc5040067 article EN cc-by Big Data and Cognitive Computing 2021-11-15

Spiking neural networks (SNNs) mimic their biological counterparts more closely than their predecessors and are considered the third generation of artificial neural networks. It has been proven that spiking neurons have a higher computational capacity and lower power requirements than sigmoidal neurons. This article introduces a new type of SNN that draws inspiration from and incorporates concepts from neuronal assemblies in the human brain. The proposed network, termed the class-dependent neuronal activation-based SNN (CDNA-SNN), assigns each neuron...

10.1109/tnnls.2024.3353571 article EN IEEE Transactions on Neural Networks and Learning Systems 2024-02-08

This paper presents a two-stage learning algorithm for a Growing-Pruning Spiking Neural Network (GPSNN) for pattern classification problems. The GPSNN uses a three-layered network architecture with an input layer employing modified population coding and leaky integrate-and-fire spiking neurons in the hidden and output layers. The class label of a sample is determined according to the output neuron with minimum spike latency. The network employs a growing-pruning mechanism. In the first stage, the network is grown and adapted to map the inputs to a hyperdimensional space. In the second stage, low...

10.1109/ijcnn.2015.7280592 article EN 2015 International Joint Conference on Neural Networks (IJCNN) 2015-07-01
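
The population-coding front end and latency-based readout used by the GPSNN above can be illustrated with a small sketch. This is not the published two-stage growing-pruning algorithm; it only shows, under simple assumed parameters, how a real-valued feature can be encoded into spike latencies by Gaussian receptive fields and how a class can be read out from the output neuron that fires first. All weights and constants below are hypothetical.

```python
import numpy as np

def population_encode(x, n_fields=5, t_max=10.0):
    """Encode a scalar feature in [0, 1] as spike latencies via Gaussian
    receptive fields: stronger activation -> earlier spike."""
    centres = np.linspace(0.0, 1.0, n_fields)
    width = 1.0 / (n_fields - 1)
    activation = np.exp(-0.5 * ((x - centres) / width) ** 2)
    return t_max * (1.0 - activation)

def lif_first_spike(spike_times, weights, tau_m=5.0, tau_s=2.0,
                    threshold=0.3, dt=0.1, t_max=20.0):
    """First spike time of a current-based LIF neuron driven by weighted
    input spikes; returns np.inf if the neuron never reaches threshold."""
    v, i_syn = 0.0, 0.0
    for step in range(int(t_max / dt)):
        t = step * dt
        arriving = (spike_times >= t) & (spike_times < t + dt)
        i_syn += np.sum(weights[arriving])     # synaptic current jumps on spike arrival
        i_syn -= dt * i_syn / tau_s            # ... and decays exponentially
        v += dt * (-v + i_syn) / tau_m         # leaky membrane integration
        if v >= threshold:
            return t
    return np.inf

# One feature, two output neurons; the predicted class is the earliest spike.
in_spikes = population_encode(0.3)
w = np.array([[0.9, 1.2, 0.4, 0.1, 0.05],      # hypothetical weights, class 0
              [0.05, 0.1, 0.4, 1.2, 0.9]])     # hypothetical weights, class 1
latencies = [lif_first_spike(in_spikes, wc) for wc in w]
predicted_class = int(np.argmin(latencies))
```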

This paper presents a new learning algorithm developed for a three-layered spiking neural network for pattern classification problems. The algorithm maximizes the interclass margin and is referred to as the two-stage margin maximization spiking neural network (TMM-SNN). In the structure learning stage, TMM-SNN completely evolves the hidden layer neurons in the first epoch. Further, TMM-SNN updates the weights over multiple epochs using a newly developed normalized membrane potential learning rule such that the interclass margins (based on the responses of the hidden neurons) are maximized. The rule considers both the local information in the spike...

10.1109/tcyb.2018.2791282 article EN IEEE Transactions on Cybernetics 2018-01-23

Generative Adversarial Networks (GANs) have led to important advancements in the generation of time-series data in areas like speech processing. This ability of GANs can be very useful for Brain-Computer Interfaces (BCIs), where collecting a large number of samples is expensive and time-consuming. To address this issue, this paper presents a new approach for generating artificial electroencephalography (EEG) data for motor imagery. The approach presented here uses generator and discriminator networks that consist of Bidirectional Long Short-Term Memory...

10.1109/ijcnn48605.2020.9206942 article EN 2020 International Joint Conference on Neural Networks (IJCNN) 2020-07-01
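
The generator/discriminator pairing described above can be sketched roughly as follows. This is not the authors' exact architecture or training setup; it only shows, with assumed dimensions (3 EEG channels, 128 time steps, 16-dimensional noise), how bidirectional LSTM networks can produce and score multichannel time series in a GAN.

```python
import torch
import torch.nn as nn

class BiLSTMGenerator(nn.Module):
    """Maps a noise sequence to an EEG-like multichannel time series."""
    def __init__(self, noise_dim=16, hidden_dim=64, n_channels=3):
        super().__init__()
        self.lstm = nn.LSTM(noise_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, n_channels)

    def forward(self, z):                      # z: (batch, time, noise_dim)
        h, _ = self.lstm(z)
        return self.out(h)                     # (batch, time, n_channels)

class BiLSTMDiscriminator(nn.Module):
    """Scores a multichannel time series as real or generated."""
    def __init__(self, n_channels=3, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, 1)

    def forward(self, x):                      # x: (batch, time, n_channels)
        h, _ = self.lstm(x)
        return self.out(h[:, -1, :])           # logit from the last time step

# Hypothetical smoke test with random stand-in data.
G, D = BiLSTMGenerator(), BiLSTMDiscriminator()
z = torch.randn(8, 128, 16)
fake = G(z)                                    # (8, 128, 3)
logit = D(fake)                                # (8, 1)
loss = nn.functional.binary_cross_entropy_with_logits(logit, torch.zeros_like(logit))
```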

Recognising familiar places is a competence required in many engineering applications that interact with the real world, such as robot navigation. Combining information from different sensory sources promotes the robustness and accuracy of place recognition. However, mismatches in data registration, dimensionality, and timing between modalities remain challenging problems in multisensory place recognition. Spurious data generated by sensor drop-out in multisensory environments is particularly problematic and often resolved through ad hoc and brittle...

10.3389/frobt.2021.732023 article EN Frontiers in Robotics and AI 2021-12-13

Predictive coding provides a computational paradigm for modeling perceptual processing as the construction of representations accounting for the causes of sensory inputs. Here, we developed a scalable, deep network architecture for predictive coding that is trained using a gated Hebbian learning rule and mimics the feedforward and feedback connectivity of the cortex. After training on image datasets, the models formed latent representations in higher areas that allowed reconstruction of the original images. We analyzed low- and high-level properties such as orientation...

10.3389/fncom.2021.666131 article EN cc-by Frontiers in Computational Neuroscience 2021-07-28
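
A stripped-down version of the predictive-coding computation described above can be written in a few lines. The sketch below is linear, single-layer, and omits the gating in the published learning rule; it only illustrates the general scheme of settling a latent representation against top-down predictions and then applying a local Hebbian-style weight update.

```python
import numpy as np

rng = np.random.default_rng(0)
n_input, n_latent = 64, 16
W = rng.standard_normal((n_input, n_latent))          # top-down (feedback) weights
W /= np.linalg.norm(W, axis=0, keepdims=True)

def infer(x, W, steps=50, lr_r=0.1):
    """Settle the latent representation r so that the top-down prediction
    W @ r accounts for the input x; returns r and the residual error."""
    r = np.zeros(W.shape[1])
    for _ in range(steps):
        error = x - W @ r              # prediction error at the input layer
        r += lr_r * (W.T @ error)      # error fed back to update latent units
    return r, x - W @ r

def hebbian_update(W, error, r, lr_w=0.01):
    """Local Hebbian-style update: presynaptic latent activity times
    postsynaptic prediction error, with column renormalisation."""
    W = W + lr_w * np.outer(error, r)
    return W / np.linalg.norm(W, axis=0, keepdims=True)

# Toy training loop on random inputs standing in for image patches.
for _ in range(200):
    x = rng.standard_normal(n_input)
    r, error = infer(x, W)
    W = hebbian_update(W, error, r)
```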

Predictive coding (PC) is an influential theory in neuroscience, which suggests the existence of a cortical architecture that is constantly generating and updating predictive representations of sensory inputs. Owing to its hierarchical and generative nature, PC has inspired many computational models of perception in the literature. However, the biological plausibility of existing models has not been sufficiently explored due to their use of artificial neurons that approximate neural activity with firing rates in the continuous time domain...

10.3389/fncom.2024.1338280 article EN cc-by Frontiers in Computational Neuroscience 2024-04-12

Predictive coding (PC) is an influential theory in neuroscience, which suggests the existence of a cortical architecture that is constantly generating and updating predictive representations of sensory inputs. Owing to its hierarchical and generative nature, PC has inspired many computational models of perception in the literature. However, the biological plausibility of existing models has not been sufficiently explored due to their use of artificial neural network features such as non-linear, continuous, clock-driven...

10.1101/2023.04.03.535317 preprint EN cc-by bioRxiv (Cold Spring Harbor Laboratory) 2023-04-03

In this article, we present a new approach to distinguish progressive mild cognitive impairment (pMCI) subjects, who eventually develop Alzheimer's disease (AD), from stable MCI (sMCI) subjects whose condition does not deteriorate into AD. The proposed approach combines the discriminating capabilities of classifiers and the representation learning capacities of autoencoders in a unified architecture, and is hence termed the joint autoencoder classifier deep neural network (JACDNN). JACDNN employs a single...

10.1002/ima.23054 article EN International Journal of Imaging Systems and Technology 2024-03-01
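
The unified architecture described above can be approximated by a shared encoder whose latent code feeds both a decoder and a classification head, trained on the sum of a reconstruction loss and a classification loss. The sketch below uses arbitrary layer sizes and random data; it is not the published JACDNN configuration.

```python
import torch
import torch.nn as nn

class JointAutoencoderClassifier(nn.Module):
    """Shared encoder feeding a decoder (reconstruction) and a classifier."""
    def __init__(self, in_dim=128, latent_dim=32, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, in_dim))
        self.classifier = nn.Linear(latent_dim, n_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z)

model = JointAutoencoderClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One hypothetical training step on random features and pMCI/sMCI labels.
x = torch.randn(16, 128)
y = torch.randint(0, 2, (16,))
recon, logits = model(x)
loss = nn.functional.mse_loss(recon, x) + nn.functional.cross_entropy(logits, y)
loss.backward()
opt.step()
```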

In this paper, we develop a new sequential learning algorithm for a spiking neural network classifier. The algorithm handles input features that are not in the form of spike trains but in real-valued (analog) form. It evolves the number of neurons automatically based on the information present in the current sample and results in a compact architecture. Hence, it is referred to as the Minimal Spiking Neural Network (MSNN). The algorithm can either add neurons or update the parameters of existing neurons based on the information contained in arriving samples. The learning rule uses excitatory/inhibitory...

10.1109/ijcnn.2014.6889775 article EN 2014 International Joint Conference on Neural Networks (IJCNN) 2014-07-01

This paper proposes a Distributed Coding Spiking Neural Network (DC-SNN) with a self-regulated learning algorithm to deal with pattern classification problems. DC-SNN employs two hidden layers. The first layer has receptive field neurons that convert the real-valued input features into spike patterns, and the second layer has LIF neurons with inhibitory interconnections. The second layer has been termed the distributed coding layer in the rest of the paper. The inhibitory interconnections ensure that each neuron in this layer learns a distinct region of the feature space. The synaptic weights between...

10.1109/ijcnn48605.2020.9207620 article EN 2020 International Joint Conference on Neural Networks (IJCNN) 2020-07-01

Behavioral variability across individuals leads to substantial performance differences during cognitive tasks, although its neuronal origin and mechanisms remain elusive. Here we use recurrent neural networks trained on a multisensory decision-making task to investigate inter-subject behavioral variability. By uniquely characterizing each network with a random synaptic-weight initialization, we observed large variability in the level of accuracy, bias, and decision speed across these networks, mimicking...

10.1101/2023.10.28.564511 preprint EN cc-by bioRxiv (Cold Spring Harbor Laboratory) 2023-11-01
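
The experimental logic of the study above, training a population of otherwise identical recurrent networks that differ in their random initialisation and comparing their behaviour, can be mimicked with a toy setup. The task, network sizes, and training budget below are invented for illustration, and the training noise varies along with the initialisation.

```python
import torch
import torch.nn as nn

def multisensory_batch(n=256, t=20, coherence=0.4):
    """Toy two-modality evidence task: each modality carries the same signed
    signal plus independent noise; the label is the sign of the signal."""
    sign = torch.randint(0, 2, (n,)).float() * 2 - 1
    x = sign[:, None, None] * coherence + torch.randn(n, t, 2)
    return x, (sign > 0).long()

def train_network(seed, epochs=30):
    torch.manual_seed(seed)                    # each simulated "subject" has its own init
    rnn = nn.RNN(2, 32, batch_first=True)
    readout = nn.Linear(32, 2)
    opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-2)
    for _ in range(epochs):
        x, y = multisensory_batch()
        h, _ = rnn(x)
        loss = nn.functional.cross_entropy(readout(h[:, -1]), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    x, y = multisensory_batch(1024)            # held-out evaluation batch
    h, _ = rnn(x)
    return (readout(h[:, -1]).argmax(1) == y).float().mean().item()

accuracies = [train_network(seed) for seed in range(5)]   # spread across "subjects"
```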

In this article, we developed an approach for detecting brain regions that contribute to Alzheimer's disease (AD) using support vector machine (SVM) classifiers and the recently proposed self-regulating particle swarm optimization (SRPSO) algorithm. SRPSO employs strategies inspired by principles of learning in humans to achieve faster and better results. The classifiers distinguishing subjects as AD patients or cognitively normal (CN) individuals were built using grey matter (GM) and white matter (WM) volumetric features...

10.1002/ima.22458 article EN cc-by International Journal of Imaging Systems and Technology 2020-06-19
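
A scaled-down version of the classification pipeline above, an SVM built on regional grey-matter/white-matter volumes, is sketched below on random placeholder data. The SRPSO-based hyperparameter optimisation used in the paper is replaced here by an ordinary grid search purely for illustration; feature sizes and labels are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder data: rows are subjects, columns stand in for regional GM/WM volumes.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 90))
y = rng.integers(0, 2, 200)                    # 0 = CN, 1 = AD (random labels)

# Grid search over the SVM hyperparameters (in place of SRPSO).
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
search = GridSearchCV(pipe, {"svc__C": [0.1, 1, 10],
                             "svc__gamma": ["scale", 0.01, 0.1]}, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```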

It has been argued that the brain is a prediction machine that continuously learns how to make better predictions about the stimuli received from the external environment. It builds a model of the world around us and uses this model to infer the causes of each stimulus. Predictive coding has been proposed as a mechanism through which the brain might be able to build such a model. However, it is not clear how predictive coding can be used to build deep neural network models while complying with the architectural constraints imposed by the brain. In this paper, we describe an algorithm for training deep generative models using...

10.1101/278218 preprint EN cc-by bioRxiv (Cold Spring Harbor Laboratory) 2018-03-07

This paper develops a new approach to estimate predicted class probabilities in deep Spiking Neural Networks (SNNs) that encourages faster classification. The proposed approach utilizes the temporal separation between the first spikes generated by the output neurons, which is then used with a cross-entropy loss for training the network. Training maximizes the separation between the first spike of the neuron associated with the correct class and those of the other classes. Higher classification performance is obtained by maximising this temporal separation, which also drives the first spikes to occur earlier in the simulation. As...

10.1109/ijcnn54540.2023.10191334 article EN 2023 International Joint Conference on Neural Networks (IJCNN) 2023-06-18
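
One way to realise the idea described above is to treat the negated first-spike times of the output neurons as logits for a softmax cross-entropy loss, so that minimising the loss drives the correct class to spike earlier than the others. The snippet below is a hedged illustration of that loss alone, not the full training procedure, and assumes the spike times are available in differentiable form.

```python
import torch
import torch.nn.functional as F

def first_spike_cross_entropy(first_spike_times, labels, scale=1.0):
    """Treat negated first-spike times as logits: softmax cross-entropy then
    rewards the correct output neuron for firing earlier than the others,
    widening the temporal separation between classes."""
    logits = -scale * first_spike_times        # earlier spike -> larger logit
    return F.cross_entropy(logits, labels)

# Hypothetical batch: 4 samples, 3 output neurons, spike times in ms
# (in a real SNN these would come from a differentiable surrogate).
t_first = torch.tensor([[12.0, 35.0, 40.0],
                        [30.0,  9.0, 28.0],
                        [22.0, 25.0, 11.0],
                        [15.0, 14.0, 50.0]], requires_grad=True)
labels = torch.tensor([0, 1, 2, 0])
loss = first_spike_cross_entropy(t_first, labels)
loss.backward()                                # gradients push correct-class spikes earlier
```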

Spiking Neural Networks (SNNs), an alternative to sigmoidal neural networks, incorporate time into their operations using discrete signals called spikes. Employing spikes enables SNNs to mimic any feedforward sigmoidal network with lower power consumption. Recently, a new type of SNN has been introduced for classification problems, known as the Degree of Belonging SNN (DoB-SNN). DoB-SNN is a two-layer spiking neural network that shows significant potential as an architecture and learning algorithm. This paper introduces a variant...

10.1109/ijcnn55064.2022.9891925 article EN 2022 International Joint Conference on Neural Networks (IJCNN) 2022-07-18