- Advanced Memory and Neural Computing
- Neural dynamics and brain function
- Neural Networks and Reservoir Computing
- Ferroelectric and Negative Capacitance Devices
- Neural Networks and Applications
- Neuroscience and Neural Engineering
- Speech and Audio Processing
- Advanced Adaptive Filtering Techniques
- Hand Gesture Recognition Systems
- CCD and CMOS Imaging Sensors
- Seismic Imaging and Inversion Techniques
- Blind Source Separation Techniques
- Photoreceptor and optogenetics research
- Advanced Data Storage Technologies
- Electronic and Structural Properties of Oxides
- Sparse and Compressive Sensing Techniques
- Machine Learning and ELM
- Algorithms and Data Compression
Intel (United States)
2021-2025
University of California, Irvine
2024
Intel (United Kingdom)
2024
Mission College
2023
Institute for Infocomm Research
2020-2022
Agency for Science, Technology and Research
2015-2022
National University of Singapore
2019-2020
Temasek Life Sciences Laboratory
2020
Nanyang Technological University
2013-2018
Temasek Polytechnic
2017
Configuring deep Spiking Neural Networks (SNNs) is an exciting research avenue for low-power, spike-event-based computation. However, the spike generation function is non-differentiable and therefore not directly compatible with the standard error backpropagation algorithm. In this paper, we introduce a new general backpropagation mechanism for learning synaptic weights and axonal delays which overcomes the problem of non-differentiability of the spike function and uses a temporal credit assignment policy for backpropagating error to preceding layers. We describe...
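The non-differentiability issue above is usually handled by keeping the hard threshold in the forward pass while substituting a smooth surrogate derivative in the backward pass. A minimal sketch, assuming an exponential surrogate with illustrative parameters (`THETA`, `ALPHA` are not taken from the paper):

```python
import numpy as np

THETA = 1.0   # firing threshold (assumed value, for illustration)
ALPHA = 5.0   # surrogate sharpness (assumed value, for illustration)

def spike_forward(v):
    """Non-differentiable spike generation: fire iff membrane potential crosses threshold."""
    return (v >= THETA).astype(float)

def spike_surrogate_grad(v):
    """Smooth stand-in for d(spike)/d(v), used only in the backward pass.

    Nonzero everywhere and peaked at the threshold, so error can flow
    back through neurons that were close to firing.
    """
    return ALPHA * np.exp(-ALPHA * np.abs(v - THETA)) / 2.0

v = np.array([0.2, 0.9, 1.1, 2.0])
spikes = spike_forward(v)        # → [0., 0., 1., 1.]
grads = spike_surrogate_grad(v)  # all positive, largest near v ≈ THETA
```

In a real training loop the surrogate would be registered as the custom backward of the spike function so that the rest of the network trains with ordinary backpropagation.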
The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables, very different from the stateless neuron models used in deep learning. The next version of Intel's neuromorphic research processor, Loihi 2, supports a wide range of stateful spiking neuron models with fully programmable dynamics. Here we showcase advanced spiking neuron models that can be used to efficiently process streaming data in simulation experiments on emulated Loihi 2 hardware. In one example, Resonate-and-Fire (RF) neurons are used to compute the Short Time Fourier...
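An RF neuron can be pictured as a damped complex oscillator that resonates with inputs near its natural frequency, which is what makes it act like one frequency bin of a Fourier-style analysis. The sketch below is illustrative only: parameter names (`decay`, `omega`, `threshold`) and the simplified spike condition (no reset, spike while the state sits in the first quadrant) are assumptions, not Loihi 2's actual neuron API.

```python
import numpy as np

def rf_neuron(x, omega, decay=0.99, threshold=1.0):
    """Return the spike train of a simplified resonate-and-fire neuron.

    The complex state rotates by `omega` radians per step and leaks by
    `decay`, so it accumulates energy only from input components near
    frequency `omega`.
    """
    z = 0.0 + 0.0j
    rotation = decay * np.exp(1j * omega)
    spikes = []
    for xt in x:
        z = rotation * z + xt
        # Simplified spike condition: oscillation is large and in phase.
        spikes.append(1 if (z.real >= threshold and z.imag >= 0) else 0)
    return np.array(spikes)

t = np.arange(200)
resonant = rf_neuron(np.sin(0.3 * t), omega=0.3)  # matched frequency: many spikes
off_band = rf_neuron(np.sin(1.5 * t), omega=0.3)  # mismatched frequency: few spikes
```

A bank of such neurons with different `omega` values gives a spiking approximation of a short-time Fourier transform over the input stream.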
Hand gestures are a form of non-verbal communication used by individuals in conjunction with speech to communicate. Nowadays, with the increasing use of technology, hand-gesture recognition is considered to be an important aspect of Human-Machine Interaction (HMI), allowing the machine to capture and interpret the user's intent and respond accordingly. The ability to discriminate between human gestures can help in several applications, such as assisted living, healthcare, neuro-rehabilitation, and sports. Recently, multi-sensor data fusion...
Spiking Neural Networks (SNNs) are bio-inspired networks that process information conveyed as temporal spikes rather than numeric values. An example of a sensor providing such data is the event-camera. It only produces an event when a pixel reports a significant brightness change. Similarly, a spiking neuron of an SNN spikes only whenever a sufficient number of input spikes occur within a short period of time. Due to their spike-based computational model, SNNs can process the output from event-based, asynchronous sensors without any pre-processing at...
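The "spike when enough events arrive within a short window" behaviour described above is captured by a leaky integrate-and-fire (LIF) neuron driven directly by event timestamps, with no frame conversion. A minimal sketch, with illustrative parameter values (`tau`, `threshold` are assumptions):

```python
import numpy as np

def lif_from_events(event_times, tau=10.0, threshold=2.5):
    """Emit an output spike whenever enough input events arrive close together.

    The membrane potential leaks exponentially between events, so widely
    spaced events never accumulate to the threshold.
    """
    v, last_t, out = 0.0, 0.0, []
    for t in event_times:
        v *= np.exp(-(t - last_t) / tau)  # leak during the gap since the last event
        v += 1.0                          # each input event adds one unit of charge
        last_t = t
        if v >= threshold:
            out.append(t)                 # output spike at this event's timestamp
            v = 0.0                       # reset after firing
    return out

burst = [1, 2, 3, 50, 51, 52]            # two tight bursts of three events
print(lif_from_events(burst))            # → [3, 52]: one output spike per burst
```

Because the state is only updated when an event arrives, the computation itself stays asynchronous, matching the sensor's sparse output.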
A critical enabler for progress in neuromorphic computing research is the ability to transparently evaluate different neuromorphic solutions on important tasks and to compare them with state-of-the-art conventional solutions. The Intel Neuromorphic Deep Noise Suppression Challenge (Intel N-DNS Challenge), inspired by the Microsoft DNS Challenge, tackles a ubiquitous and commercially relevant task: real-time audio denoising. Audio denoising is likely to reap the benefits of neuromorphic computing due to its low-bandwidth, temporal nature and its relevance...
Loihi 2 is an asynchronous, brain-inspired research processor that generalizes several fundamental elements of neuromorphic architecture, such as stateful neuron models communicating with event-driven spikes, in order to address limitations of the first-generation Loihi. Here we explore and characterize some of these generalizations, such as sigma-delta encapsulation, resonate-and-fire neurons, and integer-valued spikes, as applied to standard video, audio, and signal processing tasks. We find that these new approaches can provide orders...
We present the Surrogate-gradient Online Error-triggered Learning (SOEL) system for online few-shot learning on neuromorphic processors. The SOEL system uses a combination of transfer learning and principles from computational neuroscience and deep learning. We show that partially trained Spiking Neural Networks (SNNs) implemented on neuromorphic hardware can rapidly adapt online to new classes of data within a domain. SOEL updates trigger when an error occurs, enabling faster learning with fewer updates. Using gesture recognition as a case study, we show SOEL can be used...
The field of neuromorphic computing holds great promise in terms of advancing computing efficiency and capabilities by following brain-inspired principles. However, the rich diversity of techniques employed in neuromorphic research has resulted in a lack of clear standards for benchmarking, hindering effective evaluation of the advantages and strengths of neuromorphic methods compared to traditional deep-learning-based methods. This paper presents a collaborative effort, bringing together members from academia and industry, to define benchmarks for neuromorphic computing:...
Recent work suggests that synaptic plasticity dynamics in biological models of neurons and neuromorphic hardware are compatible with gradient-based learning [1]. Gradient-based learning requires iterating several times over a dataset, which is both time-consuming and constrains the training samples to be independently and identically distributed. This is incompatible with learning systems that do not have boundaries between training and inference, such as neuromorphic hardware. One approach to overcome these constraints is transfer learning, where a portion...
Linear recurrent neural networks enable powerful long-range sequence modeling with constant memory usage and time-per-token during inference. These architectures hold promise for streaming applications at the edge, but deployment in resource-constrained environments requires hardware-aware optimizations to minimize latency and energy consumption. Unstructured sparsity offers a compelling solution, enabling substantial reductions in compute and memory requirements, when accelerated by compatible hardware...
Stability is a key issue during spiking neural network training using SpikeProp. The inherent nonlinearity of the spiking neuron means that the learning manifold changes abruptly; therefore, we need to carefully choose the training steps at every instance. Other sources of instability are external disturbances that come along with the training sample as well as internal disturbances that arise due to modeling imperfection. The unstable scenario can be indirectly observed in the form of surges, which are sudden increases in the learning cost and a common occurrence in SpikeProp training....
Spiking Neural Networks (SNNs) are a promising research paradigm for low-power, edge-based computing. Recent work on SNN backpropagation has enabled training of SNNs for practical tasks. However, since spikes are binary events in time, standard loss formulations are not directly compatible with spike output. As a result, current works are limited to using a mean-squared loss on the spike count. In this paper, we formulate an output probability interpretation from the spike count measure and introduce a spike-based negative log-likelihood measure which is more...
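The general idea of turning spike counts into class probabilities can be sketched by treating per-class output counts as unnormalized log-odds and scoring with negative log-likelihood. This is an illustration of the concept only; the paper's exact probability formulation may differ.

```python
import numpy as np

def spike_count_nll(counts, target):
    """Negative log-likelihood of the target class given output spike counts.

    Counts are mapped to a probability distribution with a numerically
    stabilized softmax, then the target class is scored.
    """
    counts = np.asarray(counts, dtype=float)
    logits = counts - counts.max()              # subtract max for stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[target])

good = spike_count_nll([12, 2, 1], target=0)    # correct class spikes most: low loss
bad = spike_count_nll([12, 2, 1], target=2)     # wrong class: high loss
```

Unlike a mean-squared count loss, this gives a likelihood-based training signal whose gradient depends on the relative counts of all output neurons, not just the distance of each count from a fixed target.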
Supervised learning methods for Spiking Neural Networks are either able to learn the spike train of a single neuron or learn the first spike time in a multilayer feedforward connection setting. The first group of learning methods does not use the computational benefits of a hidden layer, whereas the second does not exploit the information transfer potential of a spike train. Although there have been a few efforts in the multilayer spiking neural network setting, the cost of these methods increases when a long time period is considered. We present an event-based weight update strategy that learns a spike pattern in a multilayer network and...
Delay learning in SpikeProp is useful because it eliminates the need for redundant synaptic connections in a Spiking Neural Network (SNN). The delay enhancement to SpikeProp, however, also inherits the complications present in basic SpikeProp with weight update alone that obstruct the learning process. To tackle these issues, we perform convergence analysis to investigate the conditions required to assure convergence during learning and propose an adaptive learning rate scheme based on it. A comparison of this method with fixed learning rate and weight-only implementations via simulations on various...
Surges during the training process are a major obstacle in Spiking Neural Network (SNN) training using the SpikeProp algorithm and its derivatives [1]. In this paper, we perform weight convergence analysis to understand the proper step size for SpikeProp learning and hence avoid surges during the training process. Using the results of the analysis, we propose an optimum adaptive learning rate for each iteration which will yield a suitable step size within the bounds of the convergence condition. The performance is compared with existing methods via several simulations. It is observed that the use...
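The mechanism of an iteration-wise adaptive learning rate can be illustrated with a normalized-gradient step size: when the gradient is steep the step shrinks, keeping the weight update inside a stability bound and suppressing cost surges. The specific bound in the paper comes from its convergence analysis; the `mu / (c + ||g||^2)` form below is a generic normalized-gradient choice used here purely as a sketch.

```python
import numpy as np

def adaptive_step(grad, mu=0.5, c=1e-3):
    """Illustrative adaptive learning rate: shrink the step as the gradient grows.

    `mu` and `c` are assumed constants; `c` guards against division by zero
    for vanishing gradients.
    """
    g = np.asarray(grad, dtype=float)
    return mu / (c + np.dot(g, g))

gentle = adaptive_step([0.1, 0.1])    # flat region: a larger step is allowed
steep = adaptive_step([10.0, 10.0])   # steep region: step shrinks, avoiding surges
```

The qualitative behaviour is the point: a fixed learning rate that is safe in steep regions is needlessly slow in flat ones, while the adaptive rate trades between the two automatically at each iteration.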
This work introduces a neuromorphic chip featuring an energy-efficient transfer learning capability using a new learning rule, Delta-Spike Time Dependent Plasticity (Delta-STDP). Delta-STDP enables the chip to leverage its previous knowledge to learn and solve different but related tasks at a faster rate with a limited number (few shots) of training samples. Compared to unsupervised on-chip learning (OCL) rules, it offers lower overheads as it has ~20% lower memory utilization and is operated only in the last layer of the Spiking Neural Network (SNN). A...