Martino Sorbaro

ORCID: 0000-0002-0182-7443
Research Areas
  • Neural dynamics and brain function
  • Advanced Memory and Neural Computing
  • Neuroscience and Neural Engineering
  • Ferroelectric and Negative Capacitance Devices
  • Neural Networks and Reservoir Computing
  • Electrochemical Analysis and Applications
  • Neural Networks and Applications
  • Stochastic Dynamics and Bifurcation
  • Multimodal Machine Learning Applications
  • Adversarial Robustness in Machine Learning
  • Domain Adaptation and Few-Shot Learning
  • Receptor Mechanisms and Signaling
  • CCD and CMOS Imaging Sensors
  • Protein Structure and Dynamics
  • Neonatal and fetal brain pathology
  • Physical Unclonable Functions (PUFs) and Hardware Security
  • Artificial Intelligence in Healthcare

SIB Swiss Institute of Bioinformatics
2019-2023

University of Zurich
2019-2023

ETH Zurich
2019-2023

University of Edinburgh
2016-2019

KTH Royal Institute of Technology
2016-2017

We present a method for automated spike sorting of recordings made with high-density, large-scale multielectrode arrays. Exploiting the dense sampling of single neurons by multiple electrodes, an efficient, low-dimensional representation of detected spikes, consisting of estimated spatial locations and dominant shape features, is exploited for fast and reliable clustering into units. Millions of events can be sorted in minutes, and the parallelized method scales better than quadratically with the number of spikes. Performance is demonstrated using...

10.1016/j.celrep.2017.02.038 article EN cc-by Cell Reports 2017-03-01
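The clustering step described above can be sketched in a few lines. The toy data, scales, and the minimal k-means routine below are illustrative assumptions, not the paper's pipeline: each "spike" is a point in the low-dimensional space of estimated (x, y) location plus one dominant shape feature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: spikes from two units, each represented by an
# estimated (x, y) source location and one dominant shape feature, in the
# spirit of the low-dimensional representation described in the abstract.
unit_a = rng.normal([10.0, 20.0, -1.0], 0.3, size=(200, 3))
unit_b = rng.normal([40.0, 35.0, 1.5], 0.3, size=(200, 3))
spikes = np.vstack([unit_a, unit_b])

def kmeans(points, k, iters=20):
    """Minimal k-means: cluster spikes in location/shape-feature space."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each spike to its nearest center
        labels = np.argmin(np.linalg.norm(points[:, None] - centers, axis=2), axis=1)
        # move each center to the mean of its assigned spikes
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(spikes, k=2)
```

Because the low-dimensional representation is cheap to compute per spike, clustering in this space is what makes sorting millions of events tractable.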

In the last few years, spiking neural networks (SNNs) have been demonstrated to perform on par with regular convolutional networks. Several works have proposed methods to convert a pre-trained CNN to a spiking CNN without a significant sacrifice of performance. We first demonstrate that quantization-aware training of CNNs leads to better accuracy in SNNs. One of the benefits of converting is leveraging the sparse computation of SNNs, and consequently performing equivalent computation at lower energy consumption. Here we propose an optimization strategy to train...

10.3389/fnins.2020.00662 article EN cc-by Frontiers in Neuroscience 2020-06-30
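The intuition behind quantization-aware CNN-to-SNN conversion can be illustrated with a toy example (this sketch is mine, not the paper's code): over T time steps, a non-leaky integrate-and-fire neuron driven by a constant input effectively computes the input's ReLU quantized to T levels, so a CNN trained with that quantization in the loop matches the converted SNN more closely.

```python
import numpy as np

def quantize(a, T=8):
    """ReLU activation quantized to T levels in [0, 1] -- the discretization
    that quantization-aware training exposes the CNN to."""
    return np.floor(np.clip(a, 0.0, 1.0) * T) / T

def iaf_rate(a, T=8, v_th=1.0):
    """Spike count of a non-leaky integrate-and-fire neuron driven by a
    constant input a for T time steps, divided by T."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += a                 # integrate the input
        if v >= v_th:          # threshold crossing emits a spike
            v -= v_th          # subtractive reset preserves residual charge
            spikes += 1
    return spikes / T
```

For inputs in [0, 1] the two functions agree, which is why exposing the CNN to the quantized activation during training closes the accuracy gap after conversion.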

The ability to sequentially learn multiple tasks without forgetting is a key skill of biological brains, whereas it represents a major challenge for the field of deep learning. To avoid catastrophic forgetting, various continual learning (CL) approaches have been devised. However, these usually require discrete task boundaries. This requirement seems biologically implausible and often limits the application of CL methods in the real world, where tasks are not always well defined. Here, we take inspiration from...

10.1007/s00422-023-00973-w article EN cc-by Biological Cybernetics 2023-08-17

"Forward-only" algorithms, which train neural networks while avoiding a backward pass, have recently gained attention as a way of solving the biologically unrealistic aspects of backpropagation. Here, we first address compelling challenges related to "forward-only" rules, which include reducing the performance gap with backpropagation and providing an analytical understanding of their dynamics. To this end, we show that the forward-only algorithm with top-down feedback is well-approximated by...

10.48550/arxiv.2302.05440 preprint EN other-oa arXiv (Cornell University) 2023-01-01

Event-based dynamic vision sensors provide very sparse output in the form of spikes, which makes them suitable for low-power applications. Convolutional spiking neural networks model such event-based data and develop their full energy-saving potential when deployed on asynchronous neuromorphic hardware. This being a nascent field, the sensitivity of such networks to potentially malicious adversarial attacks has received little attention so far. We show how white-box attack algorithms can be adapted to the discrete nature...

10.3389/fnins.2022.1068193 article EN cc-by Frontiers in Neuroscience 2022-12-22
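One way to adapt a gradient-based attack to binary events can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the "network" is a linear scorer whose input gradient is known in closed form, and instead of adding a continuous FGSM perturbation we flip the k event bits whose flip most reduces the score.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins: a sparse binary event frame and a linear "network".
# A real attack would use the spiking network's own input gradients.
events = (rng.random((8, 8)) < 0.2).astype(float)
w = rng.normal(size=(8, 8))

def score(x):
    """Toy model output for the target class."""
    return float((w * x).sum())

# Continuous FGSM perturbs the input by eps * sign(gradient). Events are
# binary, so instead we rank discrete flips: flipping a 1 -> 0 changes the
# score by -w at that pixel, flipping a 0 -> 1 changes it by +w.
grad = w                                          # d(score)/d(x), linear model
decrease = np.where(events == 1.0, grad, -grad)   # score drop if bit is flipped
k = 5
idx = np.argsort(decrease, axis=None)[-k:]        # k most damaging flips
adv = events.copy()
adv.flat[idx] = 1.0 - adv.flat[idx]               # apply the discrete perturbation
```

The perturbation budget is naturally expressed as a number of flipped events rather than an L-infinity ball, which matches the discrete input domain.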

In this work we explore encoding strategies learned by statistical models of sensory coding in noisy spiking networks. Early stages of communication in neural systems can be viewed as channels in the information-theoretic sense. However, neural populations face constraints not commonly considered in communications theory. Using restricted Boltzmann machines as a model of sensory encoding, we find that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli of varying...

10.3390/e22070714 article EN cc-by Entropy 2020-06-28
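A minimal restricted Boltzmann machine of the kind used as the encoding model can be sketched with one-step contrastive divergence (CD-1). The toy patterns, sizes, and omission of bias terms are my simplifications, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def recon_error(v, W):
    """Mean-field reconstruction error through the hidden layer."""
    return float(np.mean((v - sigmoid(sigmoid(v @ W) @ W.T)) ** 2))

# Toy binary "stimuli": two prototype patterns, repeated with random order.
patterns = np.array([[1, 1, 1, 0, 0, 0],
                     [0, 0, 0, 1, 1, 1]], dtype=float)
data = patterns[rng.integers(0, 2, size=60)]
W = rng.normal(scale=0.1, size=(6, 3))  # 6 visible, 3 hidden units

err_before = recon_error(data, W)
for epoch in range(200):
    h_prob = sigmoid(data @ W)                            # positive phase
    h_samp = (rng.random(h_prob.shape) < h_prob).astype(float)
    v_prob = sigmoid(h_samp @ W.T)                        # reconstruct visibles
    h_neg = sigmoid(v_prob @ W)                           # negative-phase hiddens
    W += 0.1 * (data.T @ h_prob - v_prob.T @ h_neg) / len(data)  # CD-1 update
err_after = recon_error(data, W)
```

The hidden-unit activations play the role of the noisy channel code; capacity here is set by the number of hidden units.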

Abstract A new method for automated spike sorting of recordings made with high-density, large-scale multielectrode arrays is presented. Exploiting the dense sampling of single neurons by multiple electrodes, we obtain an efficient, low-dimensional representation of detected spikes, consisting of estimated spatial locations and dominant shape features, which enables fast and reliable clustering into units. Millions of events can be sorted in minutes, and the parallelized method scales better than quadratically with the number of spikes. We...

10.1101/048645 preprint EN cc-by-nd bioRxiv (Cold Spring Harbor Laboratory) 2016-04-13

Hebbian synaptic plasticity inevitably leads to interference and forgetting when different, overlapping memory patterns are sequentially stored in the same network. Recent work on artificial neural networks shows that an information-geometric approach can be used to protect important weights and slow down forgetting. This strategy, however, is biologically implausible, as it requires knowledge of the history of previously learned patterns. In this work, we show that a purely local weight consolidation mechanism,...

10.48550/arxiv.1807.05097 preprint EN other-oa arXiv (Cornell University) 2018-01-01
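The idea of a purely local consolidation term on top of Hebbian storage can be illustrated with a small Hopfield-style network. The specific importance rule below (each synapse accumulating its own squared updates and resisting further change in proportion) is an illustrative assumption, not the paper's exact mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20
W = np.zeros((n, n))
omega = np.zeros((n, n))  # per-synapse importance, accumulated locally

def hebbian_store(W, omega, pattern, lr=0.5, lam=1.0):
    """Store a +/-1 pattern with an outer-product Hebbian update,
    attenuated by each synapse's own importance estimate."""
    dW = lr * np.outer(pattern, pattern)
    np.fill_diagonal(dW, 0.0)
    dW /= (1.0 + lam * omega)   # consolidation: important synapses move less
    omega = omega + dW ** 2     # local importance update, no task history needed
    return W + dW, omega

p1 = rng.choice([-1.0, 1.0], size=n)
p2 = rng.choice([-1.0, 1.0], size=n)
W, omega = hebbian_store(W, omega, p1)  # first memory
W, omega = hebbian_store(W, omega, p2)  # overlapping second memory

def recall(W, cue, steps=5):
    """Iterated sign dynamics for pattern recall."""
    v = cue.copy()
    for _ in range(steps):
        v = np.sign(W @ v)
        v[v == 0] = 1.0
    return v
```

Everything each synapse needs (its own update history) is locally available, which is the biological appeal over methods that require replaying past patterns.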

In the last few years, spiking neural networks (SNNs) have been demonstrated to perform on par with regular convolutional networks. Several works have proposed methods to convert a pre-trained CNN to a spiking CNN without a significant sacrifice of performance. We first demonstrate that quantization-aware training of CNNs leads to better accuracy in SNNs. One of the benefits of converting is leveraging the sparse computation of SNNs, and consequently performing equivalent computation at lower energy consumption. Here we propose an efficient optimization strategy...

10.48550/arxiv.1912.01268 preprint EN cc-by arXiv (Cornell University) 2019-01-01

Abstract Many neural computations emerge from self-sustained patterns of activity in recurrent circuits, which rely on balanced excitation and inhibition. Neuromorphic electronic circuits that use the physics of silicon to emulate neuronal dynamics represent a promising approach for implementing the brain’s computational primitives, including self-sustained activity. However, achieving the same robustness of biological networks in neuromorphic computing systems remains a challenge, due to the high degree of heterogeneity and variability...

10.1101/2023.08.14.553298 preprint EN bioRxiv (Cold Spring Harbor Laboratory) 2023-08-16
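The dynamical regime in question can be sketched in software (this is my illustration of a balanced random network with Dale's law, not the silicon implementation): excitatory and inhibitory populations with strong recurrence keep activity going with no external input after an initial kick.

```python
import numpy as np

rng = np.random.default_rng(2)

# Balanced random network: half excitatory, half inhibitory columns
# (Dale's law), tanh rate units, strong coupling.
n, n_exc = 100, 50
J = 3.0 / np.sqrt(n)  # coupling strength in the strongly recurrent regime
W = np.zeros((n, n))
W[:, :n_exc] = J * np.abs(rng.normal(size=(n, n_exc)))       # excitatory columns
W[:, n_exc:] = -J * np.abs(rng.normal(size=(n, n - n_exc)))  # inhibitory columns

r = 0.5 * rng.normal(size=n)  # initial kick; no external drive afterwards
for _ in range(500):
    r = np.tanh(W @ r)  # purely recurrent dynamics
```

With matched excitatory and inhibitory column statistics the mean input cancels, and the residual fluctuations sustain bounded, nonzero activity.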

Abstract Many neural computations emerge from self-sustained patterns of activity in recurrent circuits, which rely on balanced excitation and inhibition. Neuromorphic electronic circuits that use the physics of silicon to emulate neuronal dynamics represent a promising approach for implementing the brain's computational primitives, including self-sustained activity. However, achieving the same robustness of biological networks in neuromorphic computing systems remains a challenge, due to the high degree of heterogeneity and variability...

10.21203/rs.3.rs-3449716/v1 preprint EN cc-by Research Square (Research Square) 2023-10-20

In this overview, we discuss the connections between observations of critical dynamics in neuronal networks and the maximum entropy models that are often used as statistical models of neural activity, focusing in particular on the relation between "statistical" and "dynamical" criticality. We present examples of systems that are critical in one way but not the other, thus exemplifying the difference between the two concepts. We then discuss the emergence of Zipf laws, verifying their presence in retinal activity under a number of different conditions. In the second part of the chapter we review criticality...

10.48550/arxiv.1812.09123 preprint EN cc-by-nc-sa arXiv (Cornell University) 2018-01-01
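A Zipf-law check of the kind described can be sketched on synthetic data. The setup below is mine: pattern indices are sampled from a true p(rank) ∝ 1/rank distribution, and the rank-frequency slope is estimated in log-log coordinates (Zipf's law corresponds to a slope near -1). For real recordings, the samples would be binarized population activity patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "activity pattern" indices with Zipfian true probabilities.
n_patterns = 200
p = 1.0 / np.arange(1, n_patterns + 1)
p /= p.sum()
samples = rng.choice(n_patterns, size=200_000, p=p)

# Empirical rank-frequency curve: sort pattern counts in decreasing order.
counts = np.sort(np.bincount(samples, minlength=n_patterns))[::-1]
counts = counts[counts > 0]
ranks = np.arange(1, len(counts) + 1)

# Least-squares slope of log(count) vs log(rank); Zipf predicts -1.
slope = np.polyfit(np.log(ranks), np.log(counts), 1)[0]
```

In practice one must be careful with undersampled tails; here the sample size is large enough that the fit is dominated by well-estimated counts.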

Reliable spike detection and sorting, the process of assigning each detected spike to its originating neuron, is an essential step in the analysis of extracellular electrical recordings from neurons. The volume and complexity of the data from recently developed large-scale, high-density microelectrode arrays and probes, which allow recording from thousands of channels simultaneously, substantially complicate this task, both conceptually and computationally. This chapter provides a summary and discussion of methods that tackle these challenges, and discusses...

10.48550/arxiv.1809.01051 preprint EN other-oa arXiv (Cornell University) 2018-01-01
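A classic single-channel baseline for the detection step can be sketched as follows. The injected spike waveforms, sampling rate, and threshold multiplier are illustrative assumptions, not the chapter's specific algorithms: the noise level is estimated robustly via the median absolute deviation (MAD), and negative threshold crossings are flagged.

```python
import numpy as np

rng = np.random.default_rng(0)

# One second of synthetic noise at an assumed 20 kHz, with three injected
# negative-going "spikes".
trace = rng.normal(0.0, 1.0, size=20_000)
true_times = [1_000, 5_000, 12_000]
for t in true_times:
    trace[t:t + 10] -= 10.0

# Robust noise estimate: MAD scaled to the standard deviation of a Gaussian.
sigma = np.median(np.abs(trace)) / 0.6745
threshold = -5.0 * sigma

# Keep only the first sample of each threshold crossing.
below = trace < threshold
onsets = np.flatnonzero(below & ~np.roll(below, 1))
```

The MAD-based estimate is preferred over the raw standard deviation because the spikes themselves would otherwise inflate the noise estimate and raise the threshold.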

The ability to sequentially learn multiple tasks without forgetting is a key skill of biological brains, whereas it represents a major challenge for the field of deep learning. To avoid catastrophic forgetting, various continual learning (CL) approaches have been devised. However, these usually require discrete task boundaries. This requirement seems biologically implausible and often limits the application of CL methods in the real world, where tasks are not always well defined. Here, we take inspiration from...

10.48550/arxiv.2212.04316 preprint EN cc-by-nc-sa arXiv (Cornell University) 2022-01-01

Event-based dynamic vision sensors provide very sparse output in the form of spikes, which makes them suitable for low-power applications. Convolutional spiking neural networks model such event-based data and develop their full energy-saving potential when deployed on asynchronous neuromorphic hardware. This being a nascent field, the sensitivity of such networks to potentially malicious adversarial attacks has received little attention so far. We show how white-box attack algorithms can be adapted to the discrete nature...

10.48550/arxiv.2110.02929 preprint EN cc-by arXiv (Cornell University) 2021-01-01