Flavio Martinelli

ORCID: 0009-0007-1514-0718
Research Areas
  • Advanced Memory and Neural Computing
  • Neural Networks and Reservoir Computing
  • Neural Networks and Applications
  • Neural dynamics and brain function
  • Stochastic Gradient Optimization Techniques
  • Adversarial Robustness in Machine Learning
  • EEG and Brain-Computer Interfaces
  • Speech and Audio Processing
  • Model Reduction and Neural Networks
  • Machine Learning in Materials Science

IBM Research - Zurich
2024

ETH Zurich
2024

École Polytechnique Fédérale de Lausanne
2020-2024

Recent advances in Voice Activity Detection (VAD) are driven by Artificial and Recurrent Neural Networks (RNNs); however, using a VAD system on battery-operated devices requires further power efficiency. This can be achieved with neuromorphic hardware, which enables Spiking Neural Networks (SNNs) to perform inference at very low energy consumption. Spiking networks are characterized by their ability to process information efficiently, in a sparse cascade of binary events in time called spikes. However, a big performance gap separates them from...

10.1109/icassp40776.2020.9053412 preprint EN ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2020-04-09
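The sparse spike-based coding this abstract refers to can be sketched with a textbook leaky integrate-and-fire neuron; all parameters below are illustrative and not taken from the paper:

```python
import numpy as np

def lif_spikes(inputs, tau=20.0, v_th=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron: integrates input current and
    emits a binary spike whenever the membrane potential crosses v_th."""
    v, spikes = 0.0, []
    for i in inputs:
        v += dt * (-v / tau + i)   # leaky integration of the input
        if v >= v_th:              # threshold crossing -> binary event
            spikes.append(1)
            v = 0.0                # reset after the spike
        else:
            spikes.append(0)
    return spikes

spike_train = lif_spikes([0.3] * 20)
print(sum(spike_train), "spikes out of", len(spike_train), "steps")
```

A constant input yields only a handful of spikes per 20 time steps, illustrating why downstream computation on such binary event streams can be much cheaper than dense activations.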

Advancements in memristive devices have given rise to a new generation of specialized hardware for bio-inspired computing. However, the majority of these implementations only draw partial inspiration from the architecture and functionalities of the mammalian brain. Moreover, their use is typically restricted to specific elements within the learning algorithm, leaving computationally expensive operations to be executed in software. Here, we demonstrate actor-critic temporal difference (TD) learning on analogue...

10.21203/rs.3.rs-3993700/v1 preprint EN cc-by Research Square (Research Square) 2024-03-21
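The TD error at the heart of the actor-critic scheme mentioned above can be sketched in a few lines; this shows only the critic's tabular TD(0) update on a toy chain task (the actor, and anything hardware-specific, is omitted and the environment is invented for illustration):

```python
import numpy as np

# Tiny 1-D chain: states 0..4, reward 1 on reaching the terminal state 4.
n_states, gamma, alpha = 5, 0.9, 0.1
V = np.zeros(n_states)                  # critic: estimated state values
for _ in range(2000):
    s = 0
    while s < n_states - 1:
        s_next = s + 1                  # fixed policy: always move right
        r = 1.0 if s_next == n_states - 1 else 0.0
        td_error = r + gamma * V[s_next] - V[s]   # temporal-difference error
        V[s] += alpha * td_error        # critic update driven by the TD error
        s = s_next
print(np.round(V, 3))
```

The values converge to the discounted returns 0.9^k of each state; in an actor-critic setup the same `td_error` signal would also modulate the policy (actor) parameters.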

Advances of deep learning for Artificial Neural Networks (ANNs) have led to significant improvements in the performance of digital signal processing systems implemented on chips. Although recent progress in low-power chips is remarkable, neuromorphic chips that run Spiking Neural Network (SNN) based applications offer an even lower power consumption, as a consequence of the ensuing sparse spike-based coding scheme. In this work, we develop an SNN-based Voice Activity Detection (VAD) system that belongs to the building blocks of any...

10.1109/icassp40776.2020.9054761 article EN ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2020-04-09

MLPGradientFlow is a software package to numerically solve the gradient flow differential equation $\dot \theta = -\nabla \mathcal L(\theta; \mathcal D)$, where $\theta$ are the parameters of a multi-layer perceptron, $\mathcal D$ is some data set, and $\mathcal L$ the loss function. We show that adaptive first- or higher-order integration methods based on Runge-Kutta schemes have better accuracy and convergence speed than gradient descent with the Adam optimizer. However, we find Newton's method and approximations like BFGS preferable...

10.48550/arxiv.2301.10638 preprint EN cc-by arXiv (Cornell University) 2023-01-01
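Integrating the gradient flow $\dot \theta = -\nabla \mathcal L(\theta)$ with a Runge-Kutta scheme, as the abstract describes, can be sketched on a toy quadratic loss where the exact flow is known in closed form (this is a minimal hand-rolled RK4 illustration, not the package's actual solver):

```python
import numpy as np

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta with diagonal A,
# so the gradient flow d(theta)/dt = -A theta solves exactly to
# theta(t) = exp(-A t) theta0.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
grad = lambda th: A @ th

def rk4_flow(theta, h, steps):
    """Classical 4th-order Runge-Kutta integration of the gradient flow."""
    for _ in range(steps):
        k1 = -grad(theta)
        k2 = -grad(theta + 0.5 * h * k1)
        k3 = -grad(theta + 0.5 * h * k2)
        k4 = -grad(theta + h * k3)
        theta = theta + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return theta

theta0 = np.array([1.0, 1.0])
approx = rk4_flow(theta0, h=0.1, steps=10)        # integrate to t = 1
exact = np.exp(-np.diag(A) * 1.0) * theta0        # closed form (A diagonal)
err = np.max(np.abs(approx - exact))
print(err)
```

With step size 0.1 the 4th-order scheme tracks the exact flow to well under 1e-4, illustrating the accuracy advantage over a first-order Euler-style descent step of the same size.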

Can we identify the parameters of a neural network by probing its input-output mapping? Usually, there is no unique solution because of permutation, overparameterisation and activation function symmetries. Yet, we show that the incoming weight vector of each neuron is identifiable up to sign or scaling, depending on the activation function. For all commonly used activation functions, our novel method 'Expand-and-Cluster' identifies the size of the target network in two phases: (i) to relax the non-convexity of the problem, we train multiple student networks of expanded...

10.48550/arxiv.2304.12794 preprint EN cc-by arXiv (Cornell University) 2023-01-01
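The sign and scaling symmetries that limit identifiability in the abstract above are easy to verify numerically; this illustrative check uses a single hidden neuron with random probe inputs (not the paper's method, just the underlying symmetries):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))           # random probe inputs
w, a = rng.normal(size=3), 0.7          # incoming weights, outgoing weight

# tanh is odd: flipping the sign of both the incoming weight vector and
# the outgoing weight leaves the neuron's input-output map unchanged,
# so w is identifiable only up to sign.
out1 = a * np.tanh(x @ w)
out2 = -a * np.tanh(x @ -w)
print(np.allclose(out1, out2))

# ReLU is positively homogeneous: rescaling w by c > 0 and a by 1/c
# also leaves the map unchanged, so w is identifiable only up to scaling.
relu = lambda z: np.maximum(z, 0.0)
out3 = a * relu(x @ w)
out4 = (a / 2.0) * relu(x @ (2.0 * w))
print(np.allclose(out3, out4))
```

Both transformed neurons match the original on every probe input, which is why any parameter-recovery method can at best pin down each incoming weight vector up to these symmetries.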