Gopalakrishnan Srinivasan

ORCID: 0000-0003-2015-8545
Research Areas
  • Advanced Memory and Neural Computing
  • Ferroelectric and Negative Capacitance Devices
  • Neural dynamics and brain function
  • Neural Networks and Reservoir Computing
  • Advanced Neural Network Applications
  • Renal cell carcinoma treatment
  • Cancer Genomics and Diagnostics
  • Advanced Image and Video Retrieval Techniques
  • Medical Image Segmentation Techniques
  • Machine Learning and ELM
  • Economic and Financial Impacts of Cancer
  • Domain Adaptation and Few-Shot Learning
  • Low-power high-performance VLSI design
  • Renal and related cancers
  • Medical Imaging Techniques and Applications
  • Genetic factors in colorectal cancer
  • Image and Object Detection Techniques
  • Neuroscience and Neural Engineering
  • Pulmonary Hypertension Research and Treatments
  • Topic Modeling
  • IoT and Edge/Fog Computing
  • Colorectal Cancer Treatments and Studies
  • Advanced MRI Techniques and Applications
  • Neural Networks and Applications
  • Water Quality Monitoring Technologies

East Suffolk and North Essex NHS Foundation Trust
2020-2024

National Health Service
2020-2024

Mid Essex Hospital Services NHS Trust
2017-2023

Purdue University West Lafayette
2016-2021

R.V. College of Engineering
2020-2021

Broomfield Hospital
2017-2020

D A Pandu Memorial RV Dental College and Hospital
2020

Bharathiar University
2019

Visvesvaraya Technological University
2008

National Institutes of Health
1997-2002

Spiking Neural Networks (SNNs) have recently emerged as a prominent neural computing paradigm. However, typical shallow SNN architectures have limited capacity for expressing complex representations, while training deep SNNs using input spikes has not been successful so far. Diverse methods have been proposed to get around this issue, such as converting off-line trained Artificial Neural Networks (ANNs) to SNNs. However, the ANN-SNN conversion scheme fails to capture the temporal dynamics of a spiking system. On the other hand, it is still difficult...

10.3389/fnins.2020.00119 article EN cc-by Frontiers in Neuroscience 2020-02-28

Spiking Neural Networks (SNNs) have recently attracted significant research interest as the third generation of artificial neural networks that can enable low-power event-driven data analytics. The best performing SNNs for image recognition tasks are obtained by converting a trained Analog Neural Network (ANN), consisting of Rectified Linear Units (ReLU), to an SNN composed of integrate-and-fire neurons with "proper" firing thresholds. The converted SNNs typically incur a loss in accuracy compared to that provided by the original...

10.1109/cvpr42600.2020.01357 article EN 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2020-06-01
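
To make the conversion idea above concrete, here is a minimal sketch (illustrative only, not code from the paper) showing that an integrate-and-fire neuron with reset-by-subtraction, driven by a constant input, reproduces a ReLU activation through its firing rate; the threshold and timestep count are arbitrary assumptions.

```python
def relu(x):
    return max(x, 0.0)

def if_neuron_rate(x, threshold=1.0, timesteps=1000):
    """Firing rate of an integrate-and-fire neuron fed a constant input x."""
    v, spikes = 0.0, 0
    for _ in range(timesteps):
        v += x                      # integrate the input
        if v >= threshold:          # fire, then reset by subtracting the threshold
            spikes += 1
            v -= threshold
    return spikes / timesteps       # spikes per timestep

for x in [-0.5, 0.0, 0.3, 0.7, 1.0]:
    print(f"input={x:+.1f}  ReLU={relu(x):.2f}  IF rate={if_neuron_rate(x):.2f}")
```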

Spiking Neural Networks (SNNs) have emerged as a powerful neuromorphic computing paradigm to carry out classification and recognition tasks. Nevertheless, general-purpose platforms and custom hardware architectures implemented using standard CMOS technology have been unable to rival the power efficiency of the human brain. Hence, there is a need for novel nanoelectronic devices that can efficiently model the neurons and synapses constituting an SNN. In this work, we propose a heterostructure composed of Magnetic...

10.1038/srep29545 article EN cc-by Scientific Reports 2016-07-13

Spiking Neural Networks (SNNs) are fast becoming a promising candidate for brain-inspired neuromorphic computing because of their inherent power efficiency and impressive inference accuracy across several cognitive tasks such as image classification and speech recognition. Recent efforts in SNNs have been focused on implementing deeper networks with multiple hidden layers to incorporate exponentially more difficult functional representations. In this paper, we propose a pre-training scheme...

10.3389/fnins.2018.00435 article EN cc-by Frontiers in Neuroscience 2018-08-03

10.5281/zenodo.3607779 article EN cc-by Zenodo (CERN European Organization for Nuclear Research) 2019-02-10

Spiking Neural Networks (SNNs) operate with asynchronous discrete events (or spikes), which can potentially lead to higher energy efficiency in neuromorphic hardware implementations. Many works have shown that an SNN for inference can be formed by copying the weights from a trained Artificial Neural Network (ANN) and setting the firing threshold of each layer as the maximum input received by that layer. These converted SNNs require a large number of time steps to achieve competitive accuracy, which diminishes the energy savings. The...

10.48550/arxiv.2005.01807 preprint EN other-oa arXiv (Cornell University) 2020-01-01
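
A minimal sketch of the layer-wise threshold balancing described above, under assumed toy dimensions: the firing threshold of each converted layer is taken as the maximum pre-activation the corresponding ANN layer receives over a small calibration batch (the weights and batch here are random stand-ins, not the paper's setup).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer ReLU MLP weights and a small calibration batch.
layer_weights = [rng.standard_normal((784, 256)) * 0.05,
                 rng.standard_normal((256, 128)) * 0.05,
                 rng.standard_normal((128, 10)) * 0.05]
calib_batch = rng.random((64, 784))

def balance_thresholds(weights, batch):
    """Return one firing threshold per layer: the max input any neuron receives."""
    thresholds, act = [], batch
    for w in weights:
        pre = act @ w                        # pre-activations of this layer
        thresholds.append(float(pre.max()))  # layer-wise maximum input
        act = np.maximum(pre, 0.0)           # ReLU activations feed the next layer
    return thresholds

print(balance_thresholds(layer_weights, calib_batch))
```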

in the epidemiology and clinical association of sickle cell disease with malaria and bacterial and viral infections (including SARS-CoV-2) suggests that it should be included in the Integrated Management of Childhood Illness programme to improve outcomes. Provision for diagnosis and treatment should be incorporated into national health systems programming, with an emphasis on delivering services in the primary care setting. COVID-19 is expected to herald a global economic recession that might result in a contraction of international funding for development...

10.1016/s2352-3026(20)30123-x article EN other-oa The Lancet Haematology 2020-04-24

Metastatic papillary renal cancer (PRC) has poor outcomes, and new treatments are required. There is a strong rationale for investigating mesenchymal epithelial transition receptor (MET) and programmed cell death ligand-1 (PD-L1) inhibition in this disease. In this study, the combination of savolitinib (a MET inhibitor) and durvalumab (a PD-L1 inhibitor) was investigated. This single-arm phase II trial explored durvalumab (1,500 mg once every four weeks) and savolitinib (600 mg once daily; ClinicalTrials.gov identifier: NCT02819596). Treatment-naïve or...

10.1200/jco.22.01414 article EN Journal of Clinical Oncology 2023-02-21
Janet E. Brown Kara‐Louise Royle Walter M. Gregory Christy Ralph Anthony Maraveyas and 94 more Omar Din Timothy Eisen Paul Nathan Tom Powles Richard Griffiths Robert J. Jones Naveen Vasudev Matthew Wheater Abdel Hamid Tom Waddell R. McMenemin Poulam M. Patel James Larkin Guy Faust Adam Martin Jayne Swain Janine Bestall Christopher McCabe David Meads Vicky Goh Tze Min Wah Julia Brown Jenny Hewison Peter J. Selby Fiona Collinson Judith Carser Gopalakrishnan Srinivasan Fiona Thistlewaite Ashraf Azzabi Mark Beresford David Farrugia M. Decatris Carys Thomas Joanna Gale James J McAleer Alison Clayton Ekaterini Boleti T. Geldart Santhanam Sundar J.F. Lester Nachi Palaniappan Mohan Hingorani Khaliq Rehman Mohammad Adil Khan Naveed Sarwar Janine Graham Alastair Thomson Narayanan Srihari Denise Sheehan R. Srinivasan Omar Khan Andrew Stockdale Jane Worlding Stergios Boussios N Stuart Carey MacDonald-Smith Falalu Danwata Duncan McLaren Aravindhan Sundaramurthy Anna Lydon S. Beesley Kathryn Lees Mohini Varughese Emma Gray Angela C Scott Mark Baxter Anna Mullard Pasquale F. Innominato Gaurav Kapur Anil Kumar Natalie Charnley Caroline Manetta Prabir Chakraborti Prantik Das Sarah Rudman Henry F. Taylor Christos Mikropoulos Martin Highley D. Muthukumar Anjali Zarkar Roy Vergis Seshadri Sriprasad Patryk Brulinski Amanda Clarke Richard Osbourne Melanie Harvey Renata Dega Geoffrey Sparrow Urmila Barthakur Erica Beaumont Caroline Manetta Agnieszka Michael Emilio Porfiri Faisal Azam Ravi Kodavtiganti

10.1016/s1470-2045(22)00793-8 article EN cc-by The Lancet Oncology 2023-02-13

Spiking neural networks (SNNs) have emerged as a promising brain-inspired neuromorphic computing paradigm for cognitive system design due to their inherent event-driven processing capability. The fully connected (FC) shallow SNNs typically used for pattern recognition require a large number of trainable parameters to achieve competitive classification accuracy. In this paper, we propose a deep spiking convolutional network (SpiCNN) composed of a hierarchy of stacked convolutional layers followed by a spatial-pooling layer and...

10.1109/tcds.2018.2833071 article EN publisher-specific-oa IEEE Transactions on Cognitive and Developmental Systems 2018-05-04
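
As a rough illustration of the kind of layer SpiCNN stacks (a sketch with made-up sizes and parameters, not the paper's implementation): binary input spike maps are convolved with a shared kernel at every timestep, the result accumulates in membrane potentials, and output spikes are emitted wherever the potential crosses a threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d_valid(x, k):
    """Naive 'valid' 2D convolution of a single-channel map x with kernel k."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * k)
    return out

def spiking_conv_layer(spike_frames, kernel, threshold=2.0):
    """Accumulate convolved inputs over time; emit a spike map per timestep."""
    v, out_spikes = None, []
    for frame in spike_frames:              # frame: binary spike map at one timestep
        drive = conv2d_valid(frame, kernel)
        v = drive if v is None else v + drive
        spikes = (v >= threshold).astype(float)
        v = np.where(spikes > 0, 0.0, v)    # reset membrane where a spike occurred
        out_spikes.append(spikes)
    return out_spikes

# Random (Poisson-like) input spike frames over 10 timesteps.
frames = [(rng.random((8, 8)) < 0.3).astype(float) for _ in range(10)]
kernel = rng.random((3, 3))
out = spiking_conv_layer(frames, kernel)
print("output spikes per timestep:", [int(s.sum()) for s in out])
```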

Deep neural networks are a biologically inspired class of algorithms that have recently demonstrated state-of-the-art accuracy in large-scale classification and recognition tasks. Hardware acceleration of deep networks is of paramount importance to ensure their ubiquitous presence in future computing platforms. Indeed, a major landmark that enables efficient hardware accelerators for deep networks is the recent advance from the machine learning community establishing the viability of aggressively scaled binary networks. In this paper, we demonstrate how...

10.1109/tcsi.2019.2907488 article EN publisher-specific-oa IEEE Transactions on Circuits and Systems I Regular Papers 2019-04-24
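
One reason aggressively scaled binary networks map so well to hardware is that a dot product between ±1 weight and activation vectors reduces to an XNOR followed by a population count. A small generic sketch of that equivalence (not tied to the accelerator in the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64
w = rng.choice([-1, 1], size=n)     # binary weights
a = rng.choice([-1, 1], size=n)     # binary activations

# Reference: ordinary dot product on the ±1 values.
ref = int(np.dot(w, a))

# Bit-packed version: encode +1 as bit 1 and -1 as bit 0, then XNOR + popcount.
w_bits = int("".join('1' if b == 1 else '0' for b in w), 2)
a_bits = int("".join('1' if b == 1 else '0' for b in a), 2)
mask = (1 << n) - 1
xnor = ~(w_bits ^ a_bits) & mask    # 1 where the signs agree
matches = bin(xnor).count('1')      # popcount
binary_dot = 2 * matches - n        # agreements minus disagreements

print(ref, binary_dot)              # the two results coincide
```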

The efficiency of the human brain in performing classification tasks has attracted considerable research interest in brain-inspired neuromorphic computing. Hardware implementations of such a system aim to mimic the brain's computations through the interconnection of neurons and synaptic weights. A leaky-integrate-fire (LIF) spiking neuron model is widely used to emulate the dynamics of neuronal action potentials. In this work, we propose a spin-based LIF neuron using magneto-electric (ME) switching of ferro-magnets. The voltage across the ME oxide...

10.1109/ted.2017.2671353 article EN publisher-specific-oa IEEE Transactions on Electron Devices 2017-03-01
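
For reference, a minimal software sketch of the leaky-integrate-and-fire dynamics such a spin-based neuron is meant to emulate (the rest potential, threshold, and leak factor are arbitrary assumptions; the device physics is omitted):

```python
import numpy as np

def lif_neuron(input_current, v_rest=0.0, v_thresh=1.0, leak=0.9):
    """Discrete-time LIF update; returns the spike train for the given input."""
    v, spikes = v_rest, []
    for i in input_current:
        v = v_rest + leak * (v - v_rest) + i    # leak toward rest, then integrate
        if v >= v_thresh:
            spikes.append(1)
            v = v_rest                          # reset after the action potential
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(3)
current = rng.random(50) * 0.3                  # noisy sub-threshold drive
print("spike train:", lif_neuron(current))
```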

In this work, we propose ReStoCNet, a residual stochastic multilayer convolutional Spiking Neural Network (SNN) composed of binary kernels, to reduce the synaptic memory footprint and enhance the computational efficiency of SNNs for complex pattern recognition tasks. ReStoCNet consists of an input layer followed by stacked convolutional layers for hierarchical feature extraction, pooling layers for dimensionality reduction, and a fully-connected layer for inference. In addition, we introduce residual connections between the stacked convolutional layers to improve the learning capability of deep SNNs...

10.3389/fnins.2019.00189 article EN cc-by Frontiers in Neuroscience 2019-03-19

Trees are used by animals, humans and machines to classify information and make decisions. Natural tree structures displayed by the synapses of the brain involve potentiation and depression capable of branching, which is essential for survival and learning. Demonstrating such features in synthetic matter is challenging due to the need to host a complex energy landscape for learning, memory and electrical interrogation. We report the experimental realization of tree-like conductance states at room temperature in a strongly correlated perovskite...

10.1038/s41467-020-16105-y article EN cc-by Nature Communications 2020-05-07

The liquid state machine (LSM), a bio-inspired computing model consisting of an input layer sparsely connected to a randomly interlinked reservoir (or liquid) of spiking neurons followed by a readout layer, finds utility in a range of applications varying from robot control and sequence generation to action, speech, and image recognition. LSMs stand out among other Recurrent Neural Network (RNN) architectures due to their simplistic structure and lower training complexity. A plethora of recent efforts have been focused towards...

10.3389/fnins.2019.00504 article EN cc-by Frontiers in Neuroscience 2019-05-28
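
A compact sketch of the LSM structure described above, with toy sizes and random sparse connectivity (illustrative only, not the paper's configuration): the input sparsely drives a randomly interlinked spiking reservoir, and only a linear readout on the liquid's response is trained.

```python
import numpy as np

rng = np.random.default_rng(4)
n_in, n_res, timesteps = 10, 100, 50

# Sparse random input and recurrent (liquid) connectivity.
w_in = (rng.random((n_res, n_in)) < 0.1) * rng.standard_normal((n_res, n_in))
w_res = (rng.random((n_res, n_res)) < 0.05) * rng.standard_normal((n_res, n_res)) * 0.5

def liquid_state(spike_input):
    """Run the reservoir on an input spike train; return per-neuron spike counts."""
    v = np.zeros(n_res)
    prev_spikes = np.zeros(n_res)
    counts = np.zeros(n_res)
    for t in range(timesteps):
        v = 0.9 * v + w_in @ spike_input[t] + w_res @ prev_spikes   # leaky integration
        prev_spikes = (v >= 1.0).astype(float)                      # threshold crossing
        v = np.where(prev_spikes > 0, 0.0, v)                       # reset fired neurons
        counts += prev_spikes
    return counts

# Readout: ridge regression from liquid states to labels (two toy spike-rate classes).
X = np.stack([liquid_state((rng.random((timesteps, n_in)) < p).astype(float))
              for p in [0.1] * 20 + [0.4] * 20])
y = np.array([0] * 20 + [1] * 20)
w_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_res), X.T @ y)
print("readout training accuracy:", np.mean((X @ w_out > 0.5) == y))
```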

545 Background: Metastatic papillary renal cancer (PRC) has poor outcomes and there is a need for new treatments. There is a strong rationale for investigating MET and PD-L1 inhibition in this disease. In this study, we investigate savolitinib (a MET inhibitor) and durvalumab (a PD-L1 inhibitor) together. Methods: This single-arm phase I/II trial explored durvalumab and savolitinib at starting doses of 1,500 mg Q4W and 600 mg OD, respectively, with a 4-week run-in. Treatment-naïve or previously treated patients with metastatic PRC were included. Response rate (RR) (RECIST...

10.1200/jco.2019.37.7_suppl.545 article EN Journal of Clinical Oncology 2019-03-01

Neuromorphic algorithms are being increasingly deployed across the entire computing spectrum, from data centers to mobile and wearable devices, to solve problems involving recognition, analytics, search and inference. For example, large-scale artificial neural networks (popularly called deep learning) now represent the state-of-the-art in a wide and ever-increasing range of video/image/audio/text recognition problems. However, the growth in data sets and network complexities has led to deep learning becoming one of the most challenging...

10.1145/2897937.2905009 article EN 2016-05-25

Biologically-inspired spiking neural networks (SNNs) have attracted significant research interest due to their inherent computational efficiency in performing classification and recognition tasks. The conventional CMOS-based implementations of large-scale SNNs are power intensive. This is a consequence of the fundamental mismatch between the technology used to realize the neurons and synapses and the neuroscience mechanisms governing their operation, leading to area-expensive circuit designs. In this work, we present...

10.23919/date.2017.7927045 article EN Design, Automation & Test in Europe Conference & Exhibition (DATE), 2017 2017-03-01

Deep neural networks (DNNs) have emerged as the state-of-the-art technique in a wide range of machine learning tasks for analytics and computer vision on next-generation embedded (mobile, IoT, wearable) devices. Despite their success, they suffer from high energy requirements. In recent years, the inherent error resiliency of DNNs has been exploited by introducing approximations at either the algorithmic or hardware level (individually) to obtain energy savings while incurring tolerable accuracy degradation...

10.1109/jetcas.2018.2835809 article EN publisher-specific-oa IEEE Journal on Emerging and Selected Topics in Circuits and Systems 2018-05-14

In this work, we propose a Spiking Neural Network (SNN) consisting of input neurons sparsely connected by plastic synapses to a randomly interlinked liquid, referred to as Liquid-SNN, for unsupervised speech and image recognition. We adapt the strength of the synapses interconnecting the input and the liquid using Spike Timing Dependent Plasticity (STDP), which enables the liquid to self-learn a general representation of the unique classes of input patterns. The presented learning methodology makes it possible to infer the class of a test pattern directly from the neuronal spiking...

10.3389/fnins.2018.00524 article EN cc-by Frontiers in Neuroscience 2018-08-23
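
The STDP rule used to adapt such synapses is commonly written as an exponentially decaying function of the pre/post spike timing difference. A generic pair-based sketch (the learning rates and time constant are assumptions, not the paper's exact formulation):

```python
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                                   # pre before post -> potentiation
        return a_plus * np.exp(-dt / tau)
    return -a_minus * np.exp(dt / tau)            # post before pre -> depression

for dt in [-40, -10, -1, 1, 10, 40]:
    print(f"t_post - t_pre = {dt:+3d} ms  ->  dw = {stdp_dw(0.0, float(dt)):+.4f}")
```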

Brain-inspired learning models attempt to mimic the computations performed by the neurons and synapses constituting the human brain in order to achieve its efficiency in cognitive tasks. In this work, we propose Spike Timing Dependent Plasticity-based unsupervised feature learning using a convolution-over-time Spiking Neural Network (SNN). We use shared weight kernels that are convolved with the input patterns over time to encode representative features, thereby improving the sparsity as well as the robustness of the model. We show that the Convolutional SNN...

10.1145/3266229 article EN ACM Journal on Emerging Technologies in Computing Systems 2018-10-31

Multilayered artificial neural networks have found widespread utility in classification and recognition applications. The scale and complexity of such networks, together with the inadequacies of general-purpose computing platforms, have led to significant interest in the development of efficient hardware implementations. In this work, we focus on designing energy-efficient on-chip storage for the synaptic weights, motivated primarily by the observation that the number of synapses is orders of magnitude larger than the number of neurons. Typical digital...

10.3850/9783981537079_0909 article EN 2016-01-01

In this work, we propose a stochastic Binary Spiking Neural Network (sBSNN) composed of spiking neurons and binary synapses (stochastic only during training) that computes probabilistically with one-bit precision for power-efficient and memory-compressed neuromorphic computing. We present an energy-efficient implementation of the proposed sBSNN using a "stochastic bit" as the core computational primitive to realize the synapses, which are fabricated in a 90nm CMOS process, to achieve efficient on-chip training...

10.1109/tcsi.2020.2979826 article EN publisher-specific-oa IEEE Transactions on Circuits and Systems I Regular Papers 2020-03-17
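
As a rough illustration of the stochastic-binary-synapse idea (a sketch under assumed conventions, not the paper's circuit or training rule): a real-valued latent weight in [-1, 1] is sampled to a one-bit value with probability proportional to its magnitude, so the binary samples match the latent weight in expectation.

```python
import numpy as np

rng = np.random.default_rng(6)

def stochastic_binarize(w_latent):
    """Sample +1 with probability (w+1)/2 and -1 otherwise, elementwise."""
    p_plus = (np.clip(w_latent, -1.0, 1.0) + 1.0) / 2.0
    return np.where(rng.random(w_latent.shape) < p_plus, 1.0, -1.0)

w = np.array([-0.8, -0.2, 0.0, 0.3, 0.9])
samples = np.stack([stochastic_binarize(w) for _ in range(10000)])
print("latent weights: ", w)
print("mean of samples:", samples.mean(axis=0).round(2))   # close to the latent weights
```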