Sumit Bam Shrestha

ORCID: 0000-0002-5741-8518
Research Areas
  • Advanced Memory and Neural Computing
  • Neural Dynamics and Brain Function
  • Neural Networks and Reservoir Computing
  • Ferroelectric and Negative Capacitance Devices
  • Neural Networks and Applications
  • Neuroscience and Neural Engineering
  • Speech and Audio Processing
  • Advanced Adaptive Filtering Techniques
  • Hand Gesture Recognition Systems
  • CCD and CMOS Imaging Sensors
  • Seismic Imaging and Inversion Techniques
  • Blind Source Separation Techniques
  • Photoreceptor and Optogenetics Research
  • Advanced Data Storage Technologies
  • Electronic and Structural Properties of Oxides
  • Sparse and Compressive Sensing Techniques
  • Machine Learning and ELM
  • Algorithms and Data Compression

Affiliations

Intel (United States)
2021-2025

University of California, Irvine
2024

Intel (United Kingdom)
2024

Mission College
2023

Institute for Infocomm Research
2020-2022

Agency for Science, Technology and Research
2015-2022

National University of Singapore
2019-2020

Temasek Life Sciences Laboratory
2020

Nanyang Technological University
2013-2018

Temasek Polytechnic
2017

Configuring deep Spiking Neural Networks (SNNs) is an exciting research avenue for low power spike event based computation. However, the spike generation function is non-differentiable and therefore not directly compatible with the standard error backpropagation algorithm. In this paper, we introduce a new general backpropagation mechanism for learning synaptic weights and axonal delays which overcomes the problem of non-differentiability of the spike function and uses a temporal credit assignment policy for backpropagating error to preceding layers. We describe...

10.48550/arxiv.1810.08646 preprint EN other-oa arXiv (Cornell University) 2018-01-01
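
The core trick in this line of work is to keep the hard spike threshold in the forward pass but substitute a smooth kernel for its derivative in the backward pass. A minimal PyTorch sketch of that idea follows; the fast-sigmoid surrogate used here is illustrative, not the paper's exact derivative approximation.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential, threshold=1.0):
        ctx.save_for_backward(membrane_potential)
        ctx.threshold = threshold
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Replace the undefined derivative of the step function with a
        # smooth kernel peaked at the firing threshold (fast sigmoid here;
        # the paper uses a spike-escape-rate-based kernel).
        surrogate = 1.0 / (1.0 + 10.0 * (v - ctx.threshold).abs()) ** 2
        return grad_output * surrogate, None

v = torch.randn(4, requires_grad=True)
spikes = SurrogateSpike.apply(v)
spikes.sum().backward()
print(spikes, v.grad)  # binary spikes, yet nonzero gradients flow back
```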

The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables, very different from the stateless neuron models of deep learning. The next version of Intel's neuromorphic research processor, Loihi 2, supports a wide range of stateful neuron models with fully programmable dynamics. Here we showcase advanced neuron models that can be used to efficiently process streaming data in simulation experiments on emulated Loihi 2 hardware. In one example, Resonate-and-Fire (RF) neurons compute the Short Time Fourier...

10.1109/sips52927.2021.00053 article EN 2021-10-01
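
An RF neuron is essentially a damped complex oscillator driven by its input, so a bank of them tuned to different frequencies behaves like a streaming Fourier analysis. A minimal NumPy sketch, assuming a simple amplitude-threshold firing rule with no reset (Loihi 2's RF neurons use a phase-gated variant):

```python
import numpy as np

def resonate_and_fire(signal, freq, decay=0.99, dt=1e-3, threshold=1.0):
    """Drive a damped complex oscillator with an input signal and emit a
    spike whenever the oscillation amplitude crosses the threshold."""
    z = 0.0 + 0.0j
    rotation = decay * np.exp(2j * np.pi * freq * dt)  # per-step phase advance
    spikes = []
    for x in signal:
        z = rotation * z + x                   # rotate, decay, accumulate input
        spikes.append(np.abs(z) >= threshold)  # fire on amplitude threshold
    return np.array(spikes)

# A neuron tuned to 40 Hz resonates strongly with a 40 Hz input tone.
t = np.arange(0, 1, 1e-3)
print(resonate_and_fire(np.sin(2 * np.pi * 40 * t), freq=40).sum())
```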

Hand gestures are a form of non-verbal communication used by individuals in conjunction with speech to communicate. Nowadays, with the increasing use of technology, hand-gesture recognition is considered to be an important aspect of Human-Machine Interaction (HMI), allowing the machine to capture and interpret the user's intent and respond accordingly. The ability to discriminate human gestures can help in several applications, such as assisted living, healthcare, neuro-rehabilitation, and sports. Recently, multi-sensor data fusion...

10.3389/fnins.2020.00637 article EN cc-by Frontiers in Neuroscience 2020-08-05

Spiking Neural Networks (SNNs) are bio-inspired networks that process information conveyed as temporal spikes rather than numeric values. An example of a sensor providing such data is the event-camera. It only produces an event when a pixel reports a significant brightness change. Similarly, a spiking neuron of an SNN only produces a spike whenever a significant number of spikes occur within a short period of time. Due to their spike-based computational model, SNNs can process output from event-based, asynchronous sensors without any pre-processing at...

10.1109/icra40945.2020.9197133 article EN 2020-05-01
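
The coincidence-detection behavior described above is captured by the leaky integrate-and-fire (LIF) model: events accumulate on a decaying membrane potential, and a spike is emitted only when enough of them arrive close together in time. A minimal sketch; the decay constant and threshold are illustrative choices:

```python
import numpy as np

def lif_response(events, decay=0.9, threshold=1.0):
    """Leaky integrate-and-fire response to a binary event stream:
    integrate incoming events, leak over time, spike and reset at threshold."""
    v, out = 0.0, []
    for e in np.asarray(events, dtype=float):
        v = decay * v + e           # leaky integration of input events
        if v >= threshold:          # enough events arrived close together
            out.append(1)
            v = 0.0                 # reset after the spike
        else:
            out.append(0)
    return np.array(out)

rng = np.random.default_rng(0)
print(lif_response(rng.random(50) < 0.3).sum())
```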

A critical enabler for progress in neuromorphic computing research is the ability to transparently evaluate different solutions on important tasks and compare them to state-of-the-art conventional solutions. The Intel Neuromorphic Deep Noise Suppression Challenge (Intel N-DNS Challenge), inspired by the Microsoft DNS Challenge, tackles a ubiquitous and commercially relevant task: real-time audio denoising. Audio denoising is likely to reap the benefits of neuromorphic computing due to its low-bandwidth, temporal nature and its relevance...

10.1088/2634-4386/ace737 article EN cc-by Neuromorphic Computing and Engineering 2023-07-13
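
Denoising quality in this space is commonly scored with scale-invariant SNR (SI-SNR), one of the metrics the N-DNS Challenge reports. A minimal NumPy implementation, assuming time-aligned single-channel signals:

```python
import numpy as np

def si_snr(estimate, target, eps=1e-8):
    """Scale-invariant SNR in dB: project the estimate onto the target and
    compare the projected signal against the residual distortion."""
    target = target - target.mean()
    estimate = estimate - estimate.mean()
    s_target = (np.dot(estimate, target) / (np.dot(target, target) + eps)) * target
    noise = estimate - s_target
    return 10 * np.log10((np.dot(s_target, s_target) + eps)
                         / (np.dot(noise, noise) + eps))

rng = np.random.default_rng(0)
clean = rng.standard_normal(16000)
noisy = clean + 0.1 * rng.standard_normal(16000)  # ~20 dB SI-SNR
print(si_snr(noisy, clean))
```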

Loihi 2 is an asynchronous, brain-inspired research processor that generalizes several fundamental elements of neuromorphic architecture, such as stateful neuron models communicating with event-driven spikes, in order to address limitations of the first-generation Loihi. Here we explore and characterize some of these generalizations, such as sigma-delta encapsulation, resonate-and-fire neurons, and integer-valued spikes, applied to standard video, audio, and signal processing tasks. We find the new approaches can provide orders...

10.1109/icassp48485.2024.10448003 article EN ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2024-03-18
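
Sigma-delta encapsulation turns a dense signal into sparse events by transmitting only significant changes relative to the last transmitted value; the receiver reconstructs the signal by integrating the received deltas. A minimal sketch with an illustrative fixed threshold:

```python
import numpy as np

def sigma_delta_encode(x, threshold=0.1):
    """Transmit only changes: emit a quantized delta when the input moves
    more than `threshold` away from the last transmitted value."""
    ref, deltas = 0.0, []
    for sample in x:
        delta = sample - ref
        if abs(delta) >= threshold:
            quantized = threshold * np.round(delta / threshold)
            deltas.append(quantized)   # sparse event carrying the change
            ref += quantized
        else:
            deltas.append(0.0)         # no event: receiver keeps its state
    return np.array(deltas)

def sigma_delta_decode(deltas):
    return np.cumsum(deltas)           # receiver integrates received deltas

t = np.linspace(0, 1, 200)
x = np.sin(2 * np.pi * 3 * t)
d = sigma_delta_encode(x)
print(f"events: {(d != 0).mean():.0%}, "
      f"max error: {np.abs(sigma_delta_decode(d) - x).max():.3f}")
```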

We present the Surrogate-gradient Online Error-triggered Learning (SOEL) system for online few-shot learning on neuromorphic processors. The SOEL system uses a combination of transfer learning and principles from computational neuroscience and deep learning. We show that partially trained Spiking Neural Networks (SNNs) implemented on neuromorphic hardware can rapidly adapt to new classes of data within a domain. Updates trigger when an error occurs, enabling faster learning with fewer updates. Using gesture recognition as a case study, we show SOEL can be used...

10.1109/jetcas.2020.3032058 article EN publisher-specific-oa IEEE Journal on Emerging and Selected Topics in Circuits and Systems 2020-10-21
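
The key scheduling idea is that weight updates fire only when the output error is large enough, which sharply cuts the number of updates. The sketch below shows that event-triggered schedule on a plain delta rule; SOEL itself applies surrogate-gradient updates to an SNN's output layer, so this is illustrative only.

```python
import numpy as np

def error_triggered_step(weights, features, target, lr=0.1, err_threshold=0.5):
    """Delta-rule update that fires only when the prediction error exceeds
    a threshold; otherwise the weights are left untouched."""
    error = target - weights @ features
    if np.abs(error).max() > err_threshold:   # update only on large errors
        weights = weights + lr * np.outer(error, features)
    return weights, error

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 8)) * 0.1
x = rng.standard_normal(8)
y = np.array([1.0, 0.0, 0.0])
for _ in range(20):
    W, err = error_triggered_step(W, x, y)
print(np.abs(err).max())  # error shrinks until updates stop triggering
```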

The field of neuromorphic computing holds great promise in terms of advancing computing efficiency and capabilities by following brain-inspired principles. However, the rich diversity of techniques employed in neuromorphic research has resulted in a lack of clear standards for benchmarking, hindering effective evaluation of the advantages and strengths of neuromorphic methods compared to traditional deep-learning-based methods. This paper presents a collaborative effort, bringing together members from academia and industry, to define benchmarks for neuromorphic computing:...

10.48550/arxiv.2304.04640 preprint EN cc-by arXiv (Cornell University) 2023-01-01

Recent work suggests that synaptic plasticity dynamics in biological models of neurons and in neuromorphic hardware are compatible with gradient-based learning [1]. Gradient-based learning requires iterating several times over a dataset, which is both time-consuming and constrains the training samples to be independently and identically distributed. This is incompatible with learning systems that do not have boundaries between training and inference, such as neuromorphic hardware. One approach to overcome these constraints is transfer learning, where a portion...

10.1109/aicas48895.2020.9073948 article EN 2020-04-24

Linear recurrent neural networks enable powerful long-range sequence modeling with constant memory usage and time-per-token during inference. These architectures hold promise for streaming applications at the edge, but deployment in resource-constrained environments requires hardware-aware optimizations to minimize latency and energy consumption. Unstructured sparsity offers a compelling solution, enabling substantial reductions in compute and memory requirements when accelerated by compatible hardware...

10.48550/arxiv.2502.01330 preprint EN arXiv (Cornell University) 2025-02-03
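
Unstructured sparsity means zeroing individual weights wherever they matter least, with no block or row pattern imposed. A one-shot magnitude-pruning sketch on a recurrent weight matrix; the paper's training-time sparsification procedure is more involved, so this only illustrates the kind of sparsity being exploited:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries so only the strongest
    (1 - sparsity) fraction of connections remains."""
    cutoff = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= cutoff, weights, 0.0)

rng = np.random.default_rng(0)
W_rec = rng.standard_normal((256, 256)) / np.sqrt(256)
W_sparse = magnitude_prune(W_rec, sparsity=0.9)
print(f"nonzero: {(W_sparse != 0).mean():.1%}")  # ~10% of connections survive
```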

Stability is a key issue during spiking neural network training using SpikeProp. The inherent nonlinearity of the spiking neuron means that the learning manifold changes abruptly; therefore, we need to carefully choose the learning steps at every instance. Other sources of instability are external disturbances that come along with the training sample as well as internal disturbances that arise due to modeling imperfection. An unstable scenario can be indirectly observed in the form of surges, which are sudden increases in the learning cost and a common occurrence in SpikeProp training...

10.1109/tnnls.2017.2713125 article EN IEEE Transactions on Neural Networks and Learning Systems 2017-07-04

Spiking Neural Networks (SNNs) are a promising research paradigm for low power edge-based computing. Recent works in SNN backpropagation have enabled training of SNNs for practical tasks. However, since spikes are binary events in time, standard loss formulations are not directly compatible with spike output. As a result, current works are limited to using mean-squared loss on spike count. In this paper, we formulate an output probability interpretation from the spike count measure and introduce spike-based negative log-likelihood measures which are more...

10.1109/ijcnn55064.2022.9892379 article EN 2022 International Joint Conference on Neural Networks (IJCNN) 2022-07-18
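
A natural way to get probabilities out of an SNN classifier is to treat per-class spike counts as unnormalized log-probabilities via a softmax and train with negative log-likelihood. A minimal PyTorch sketch of that interpretation; the paper's exact probability model may differ:

```python
import torch
import torch.nn.functional as F

def spike_count_nll(spike_trains, labels):
    """Interpret per-class spike counts as unnormalized log-probabilities
    (via softmax) and train with the negative log-likelihood."""
    counts = spike_trains.sum(dim=-1)            # (batch, classes): spikes over time
    log_probs = F.log_softmax(counts, dim=-1)    # counts -> class probabilities
    return F.nll_loss(log_probs, labels)

spikes = (torch.rand(4, 10, 100) > 0.8).float()  # batch of binary spike trains
labels = torch.tensor([3, 1, 0, 7])
print(spike_count_nll(spikes, labels))
```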

10.1016/j.neunet.2016.10.011 article EN Neural Networks 2016-11-08

Supervised learning methods for Spiking Neural Networks are either able to learn a spike train of a single neuron or only the first spike in a multilayer feedforward connection setting. The first group of learning methods does not use the computational benefits of a hidden layer, whereas the second does not exploit the information transfer potential of a spike train. Although there have been a few efforts on learning spike trains in a multilayer setting of spiking neural networks, the cost of these methods increases when a spike train of long period is considered. We present an event based weight update strategy that learns a spike train pattern in a multilayer network and...

10.1109/icmla.2016.0061 article EN 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA) 2016-12-01

Delay learning in SpikeProp is useful because it eliminates the need for redundant synaptic connections in a Spiking Neural Network (SNN). The delay learning enhancement to SpikeProp, however, also inherits the complications present in basic SpikeProp with weight update that obstruct the learning process. To tackle these issues, we perform convergence analysis to investigate the conditions required to assure convergence during learning and propose an adaptive learning rate scheme based on it. We compare this method with fixed learning rate implementations and weight-update-only methods via simulations on various...

10.1109/ijcnn.2016.7727209 article EN 2016 International Joint Conference on Neural Networks (IJCNN) 2016-07-01

Surges during the training process are a major obstacle in Spiking Neural Network (SNN) training using the SpikeProp algorithm and its derivatives [1]. In this paper, we perform weight convergence analysis to understand the proper step size for SpikeProp learning and hence avoid surges during the training process. Using the results of the analysis, we propose an optimum adaptive learning rate for each iteration which will yield a suitable step size within the bounds of the convergence condition. The performance is compared with existing methods via several simulations. It is observed that the use...

10.1109/allerton.2013.6736567 article EN 2013-10-01
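
Keeping each update inside a convergence bound typically means normalizing the step size by the energy of the quantities entering the update, in the spirit of normalized LMS. The sketch below illustrates that idea on a linear unit; the paper derives the corresponding bound for SpikeProp's weight updates, so this is an analogy, not the paper's rule.

```python
import numpy as np

def adaptive_lr_step(weights, grad, inputs, lr_max=1.0, eps=1e-8):
    """Normalized-gradient step: scale the learning rate by the input energy
    so each update stays inside a convergence bound (cf. normalized LMS)."""
    lr = lr_max / (np.dot(inputs, inputs) + eps)  # shrink step for strong inputs
    return weights - lr * grad

w = np.zeros(4)
x = np.array([1.0, -2.0, 0.5, 3.0])
target = 1.0
for _ in range(10):
    err = np.dot(w, x) - target
    w = adaptive_lr_step(w, err * x, x)
print(np.dot(w, x))  # converges toward the target without surges
```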

This work introduces a neuromorphic chip featuring energy-efficient transfer learning capability using a new learning rule, Delta-Spike Time Dependent Plasticity (Delta-STDP). Delta-STDP enables the chip to leverage its previous knowledge to learn and solve different but related tasks at a faster rate with a limited number (few shots) of training samples. Compared to unsupervised on-chip learning (OCL) rules, it offers lower overheads, as it has ~20% lower memory utilization and is operated only on the last layer of the Spiking Neural Network (SNN). A...

10.1109/esscirc53450.2021.9567782 article EN ESSCIRC 2022- IEEE 48th European Solid State Circuits Conference (ESSCIRC) 2021-09-13
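
The few-shot transfer idea is to freeze the earlier layers and apply a delta-driven plasticity rule only at the output layer, where the gap between target and actual activity steers the weights. A rate-based sketch of such a last-layer delta update; the chip's actual Delta-STDP rule operates on spike timing, so this is illustrative only:

```python
import numpy as np

def delta_stdp_update(weights, pre_rates, post_rates, target_rates, lr=0.05):
    """Last-layer update driven by the difference (delta) between target and
    actual postsynaptic activity, gated by presynaptic activity."""
    delta = target_rates - post_rates            # supervision signal
    return weights + lr * np.outer(delta, pre_rates)

rng = np.random.default_rng(0)
W = rng.standard_normal((5, 32)) * 0.1
pre = rng.random(32)                              # frozen hidden-layer rates
target = np.eye(5)[2]                             # one-hot target class
for _ in range(30):                               # few-shot adaptation loop
    post = W @ pre
    W = delta_stdp_update(W, pre, post, target)
print(np.argmax(W @ pre))                         # -> class 2 after adaptation
```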