Simone Scardapane

ORCID: 0000-0003-0881-8344
Research Areas
  • Neural Networks and Applications
  • Machine Learning and ELM
  • Domain Adaptation and Few-Shot Learning
  • Speech and Audio Processing
  • Advanced Adaptive Filtering Techniques
  • Advanced Graph Neural Networks
  • Neural Networks and Reservoir Computing
  • Advanced Neural Network Applications
  • Multimodal Machine Learning Applications
  • Advanced Memory and Neural Computing
  • Adversarial Robustness in Machine Learning
  • Model Reduction and Neural Networks
  • Music and Audio Processing
  • Explainable Artificial Intelligence (XAI)
  • Blind Source Separation Techniques
  • Sparse and Compressive Sensing Techniques
  • Anomaly Detection Techniques and Applications
  • Stochastic Gradient Optimization Techniques
  • Remote-Sensing Image Classification
  • Privacy-Preserving Technologies in Data
  • Human Pose and Action Recognition
  • Face and Expression Recognition
  • Image and Signal Denoising Methods
  • Energy Load and Power Forecasting
  • Generative Adversarial Networks and Image Synthesis

Sapienza University of Rome
2016-2025

Consorzio Nazionale Interuniversitario per le Telecomunicazioni
2024

Birla Institute of Technology and Science, Pilani
2023

Neural networks, as powerful tools for data mining and knowledge engineering, can learn from data to build feature‐based classifiers and nonlinear predictive models. Training neural networks involves the optimization of nonconvex objective functions, and usually the learning process is costly and infeasible for applications associated with data streams. A possible, albeit counterintuitive, alternative is to randomly assign a subset of the networks' weights so that the resulting optimization task can be formulated as a linear least‐squares problem. This...

10.1002/widm.1200 article EN Wiley Interdisciplinary Reviews Data Mining and Knowledge Discovery 2017-02-09
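
A minimal sketch of the random-weight idea described above: hidden weights are drawn at random and kept fixed, so only the linear readout is fit by (regularized) least squares. Toy data and hyperparameters are illustrative, not the survey's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_random_features(X, y, n_hidden=200, reg=1e-3):
    """Random nonlinear expansion + ridge-regularized least squares for the readout."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # fixed random feature map
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression example
X = rng.normal(size=(500, 5))
y = np.sin(X).sum(axis=1)
W, b, beta = fit_random_features(X, y)
print(np.mean((predict(X, W, b, beta) - y) ** 2))
```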

Classification of multivariate time series (MTS) has been tackled with a large variety of methodologies and applied to a wide range of scenarios. Reservoir computing (RC) provides efficient tools to generate a vectorial, fixed-size representation of the MTS that can be further processed by standard classifiers. Despite their unrivaled training speed, classifiers based on the RC architecture fail to achieve the same accuracy of fully trainable neural networks. In this article, we introduce the reservoir model space, an...

10.1109/tnnls.2020.3001377 article EN IEEE Transactions on Neural Networks and Learning Systems 2020-06-29
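
A minimal echo state network sketch of how RC turns a variable-length MTS into a fixed-size vector for a standard classifier. Here the representation is simply the mean reservoir state; the article proposes a richer representation (the reservoir model space) in place of this step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random reservoir, shared across all time series.
n_in, n_res = 3, 100
W_in = rng.uniform(-1, 1, size=(n_in, n_res))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius < 1 (echo-state property)

def mts_representation(X, leak=0.3):
    """Map a variable-length MTS X of shape (T, n_in) to a fixed-size vector."""
    h, states = np.zeros(n_res), []
    for x in X:
        h = (1 - leak) * h + leak * np.tanh(x @ W_in + h @ W)
        states.append(h.copy())
    return np.mean(states, axis=0)             # mean reservoir state as representation

X = rng.normal(size=(120, n_in))               # toy series: 120 steps, 3 variables
print(mts_representation(X).shape)             # feed this vector to any standard classifier
```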

The extreme learning machine (ELM) was recently proposed as a unifying framework for different families of learning algorithms. The classical ELM model consists of a linear combination of a fixed number of nonlinear expansions of the input vector. Learning in ELM is hence equivalent to finding the optimal weights that minimize the error on a dataset. The update works in batch mode, either with explicit feature mappings or with implicit ones defined by kernels. Although an online version has been proposed for the former, no work has been done up to this point for the latter, and whether...

10.1109/tnnls.2014.2382094 article EN IEEE Transactions on Neural Networks and Learning Systems 2014-12-31
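
For context, a batch sketch of the kernel formulation mentioned above, which reduces to a regularized linear system in the kernel matrix; the article's contribution concerns the online/sequential counterpart of this step. Kernel choice and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fit(X, y, reg=1e-2, gamma=1.0):
    """Batch kernel solution: solve (K + reg*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + reg * np.eye(len(X)), y)

def kernel_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
alpha = kernel_fit(X, y)
print(kernel_predict(X, alpha, X[:5]))
```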

Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning. Recently, we have witnessed a renewed and fast-growing interest in continual learning, especially within the deep learning community. However, algorithmic solutions are often difficult to re-implement, evaluate and port across different settings, where even results on standard benchmarks are hard to reproduce. In this work, we propose Avalanche, an open-source end-to-end library for continual learning research...

10.1109/cvprw53098.2021.00399 article EN 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) 2021-06-01
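
To make the continual-learning setting concrete, below is a plain-PyTorch loop over a toy non-stationary stream of "experiences" with a naive fine-tuning strategy. It is a generic illustration of the workflow that Avalanche standardizes (benchmarks, strategies, evaluation), not the Avalanche API itself.

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def make_experience(shift):
    """Toy non-stationary stream: each experience has a different input distribution."""
    X = torch.randn(256, 10) + shift
    y = (X.sum(dim=1) > shift * 10).long()
    return torch.utils.data.TensorDataset(X, y)

stream = [make_experience(s) for s in (0.0, 1.0, 2.0)]

for exp_id, experience in enumerate(stream):           # naive fine-tuning over the stream
    loader = torch.utils.data.DataLoader(experience, batch_size=32, shuffle=True)
    for X, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    print(f"experience {exp_id}: last batch loss {loss.item():.3f}")
```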

Complex-valued neural networks (CVNNs) are a powerful modeling tool for domains where data can be naturally interpreted in terms of complex numbers. However, several analytical properties of the complex domain (such as holomorphicity) make the design of CVNNs a more challenging task than their real counterpart. In this paper, we consider the problem of flexible activation functions (AFs) in the complex domain, i.e., AFs endowed with sufficient degrees of freedom to adapt their shape given the training data. While this problem has received considerable...

10.1109/tetci.2018.2872600 article EN IEEE Transactions on Emerging Topics in Computational Intelligence 2018-10-17
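
A hedged sketch of a "flexible" activation in the complex domain: a split-type activation whose shape is controlled by a small set of trainable kernel coefficients, applied after a complex-valued dense layer. The parameterization is a generic choice for illustration, not necessarily the one adopted in the paper.

```python
import torch
from torch import nn

class SplitKernelActivation(nn.Module):
    """Trainable split activation: a small kernel expansion with free coefficients,
    applied independently to the real and imaginary parts."""
    def __init__(self, n_kernels=8, span=2.0, gamma=1.0):
        super().__init__()
        self.register_buffer("centers", torch.linspace(-span, span, n_kernels))
        self.weights = nn.Parameter(torch.randn(n_kernels) * 0.1)  # degrees of freedom
        self.gamma = gamma

    def _phi(self, x):
        # Gaussian kernel expansion of each real scalar, mixed by the trainable weights.
        return torch.exp(-self.gamma * (x.unsqueeze(-1) - self.centers) ** 2) @ self.weights

    def forward(self, z):
        return torch.complex(self._phi(z.real), self._phi(z.imag))

# One complex-valued dense layer followed by the adaptive activation.
W = nn.Parameter(torch.randn(16, 8, dtype=torch.cfloat) * 0.1)
act = SplitKernelActivation()
z = torch.randn(4, 16, dtype=torch.cfloat)
out = act(z @ W)
print(out.shape, out.dtype)
```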

Graph convolutional networks (GCNs) are a family of neural network models that perform inference on graph data by interleaving vertexwise operations and message-passing exchanges across the nodes. Concerning the latter, two key questions arise: 1) how to design a differentiable exchange protocol (e.g., the one-hop Laplacian smoothing in the original GCN) and 2) how to characterize the tradeoff in complexity with respect to the local updates. In this brief, we show that state-of-the-art results can be achieved by adapting the number...

10.1109/tnnls.2020.3025110 article EN IEEE Transactions on Neural Networks and Learning Systems 2020-09-25
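
A small numpy illustration of the propagation step discussed above: repeated one-hop smoothing with the symmetrically normalized adjacency, where the number of hops is the complexity knob. Here the hop count is fixed per run; adapting it is what the brief studies.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy graph: adjacency A, node features X.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 5))

# Symmetrically normalized adjacency with self-loops (as in the original GCN).
A_hat = A + np.eye(len(A))
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt

def propagate(H, n_hops):
    """Diffuse node states for a chosen number of message-passing steps."""
    for _ in range(n_hops):
        H = P @ H            # one-hop Laplacian-style smoothing
    return H

for k in (1, 2, 4):
    H = propagate(X, k)
    print(k, float(np.abs(H - H.mean(axis=0)).mean()))  # spread of node states after k hops
```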

Objective: Deep learning tools applied to high-resolution neurophysiological data have significantly progressed, offering enhanced decoding, real-time processing, and readability for practical applications. However, the design of artificial neural networks to analyze activity in vivo remains a challenge, requiring a delicate balance between efficiency in low-data regimes and interpretability of the results.
Approach: To address this, we introduce a novel specialized transformer...

10.1088/1741-2552/adaef0 article EN Journal of Neural Engineering 2025-01-27

Graph representation learning has become a ubiquitous component in many scenarios, ranging from social network analysis to energy forecasting in smart grids. In several applications, ensuring the fairness of the node (or graph) representations with respect to some protected attributes is crucial for their correct deployment. Yet, fairness in graph deep learning remains under-explored, with few solutions available. In particular, the tendency of similar nodes to cluster on real-world graphs (i.e., homophily) can dramatically worsen the fairness of these...

10.1109/tai.2021.3133818 article EN IEEE Transactions on Artificial Intelligence 2021-12-09

In recent years, there has been a growing demand for renewable energy sources, which are inherently associated with decentralized distribution and dependent on weather conditions. Their management and the forecasting of the energy produced are tasks of increasing complexity. Spatio-Temporal Graph Neural Networks have been applied in this context with excellent results, taking advantage of the correct integration of both topological data, defined by the plants on the territory, and temporal data from the time series. A drawback of graph neural networks is...

10.1016/j.apenergy.2023.122151 article EN cc-by-nc-nd Applied Energy 2023-10-31
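
A generic spatio-temporal block for intuition: one graph-convolution step over the plant graph followed by a GRU over time, producing a next-step forecast per node. Shapes, names, and the toy graph are illustrative assumptions, not the paper's architecture.

```python
import torch
from torch import nn

class SpatioTemporalBlock(nn.Module):
    """Spatial mixing over the plant graph + temporal modeling with a GRU."""
    def __init__(self, n_features, hidden):
        super().__init__()
        self.lin = nn.Linear(n_features, hidden)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, X, A_norm):
        # X: (n_nodes, T, n_features); A_norm: normalized adjacency (n_nodes, n_nodes)
        H = torch.relu(torch.einsum("ij,jtf->itf", A_norm, self.lin(X)))  # spatial mixing
        H, _ = self.gru(H)                                                # temporal modeling
        return self.head(H[:, -1])                                        # one forecast per node

n_nodes, T, n_features = 6, 24, 4
A = (torch.rand(n_nodes, n_nodes) > 0.5).float()
A = ((A + A.T) > 0).float() + torch.eye(n_nodes)
A_norm = A / A.sum(dim=1, keepdim=True)
X = torch.randn(n_nodes, T, n_features)
print(SpatioTemporalBlock(n_features, 32)(X, A_norm).shape)   # (n_nodes, 1)
```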

The modeling of human emotion expression in speech signals is an important, yet challenging task. The high resource demand of emotion recognition models, combined with the general scarcity of emotion-labelled data, are obstacles to the development and application of effective solutions in this field. In this paper, we present an approach to jointly circumvent these difficulties. Our method, named RH-emo, is a novel semi-supervised architecture aimed at extracting quaternion embeddings from real-valued monoaural spectrograms, enabling...

10.1109/taslp.2023.3250840 article EN cc-by IEEE/ACM Transactions on Audio Speech and Language Processing 2023-01-01

In the field of earth observation (EO), continual learning (CL) algorithms have been proposed to deal with large datasets by decomposing them into several subsets and processing them incrementally. The majority of these algorithms assume that data are, first, coming from a single source and, second, fully labeled. Real-world EO data are instead characterized by heterogeneity (e.g., aerial, satellite, or drone scenarios), and for the most part they are unlabeled, meaning they can be exploited only through emerging self-supervised learning (SSL)...

10.1109/jstars.2023.3280029 article EN cc-by IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 2023-01-01

Learning from data in the quaternion domain enables us to exploit the internal dependencies of 4D signals and treat them as a single entity. One of the models that perfectly suits quaternion-valued processing is represented by 3D acoustic signals and their spherical harmonics decomposition. In this paper, we address the problem of localizing and detecting sound events in the spatial sound field using quaternion-valued processing. In particular, we consider the spherical harmonic components captured by a first-order ambisonic microphone and process them with a quaternion convolutional neural...

10.1109/icassp.2019.8682711 article EN ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2019-04-17
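
A sketch of quaternion convolution via the Hamilton product, with the four first-order ambisonic channels (W, X, Y, Z) packed as the quaternion components. The layer structure and sizes are generic illustrations, not the paper's exact architecture.

```python
import torch
from torch import nn

class QuaternionConv1d(nn.Module):
    """1-D quaternion convolution: input and weights are split into four components
    (r, i, j, k) and combined through the Hamilton product."""
    def __init__(self, in_q, out_q, kernel_size):
        super().__init__()
        conv = lambda: nn.Conv1d(in_q, out_q, kernel_size, padding=kernel_size // 2, bias=False)
        self.r, self.i, self.j, self.k = conv(), conv(), conv(), conv()

    def forward(self, a, b, c, d):                       # each: (batch, in_q, time)
        r, i, j, k = self.r, self.i, self.j, self.k
        return (r(a) - i(b) - j(c) - k(d),                # Hamilton product, real part
                r(b) + i(a) + j(d) - k(c),                # i component
                r(c) - i(d) + j(a) + k(b),                # j component
                r(d) + i(c) - j(b) + k(a))                # k component

# Toy first-order ambisonic signal: W, X, Y, Z channels as quaternion components.
batch, time = 2, 1024
W, X, Y, Z = (torch.randn(batch, 1, time) for _ in range(4))
layer = QuaternionConv1d(in_q=1, out_q=8, kernel_size=5)
outs = layer(W, X, Y, Z)
print([o.shape for o in outs])   # four tensors of shape (2, 8, 1024)
```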

Distributed learning refers to the problem of inferring a function when the training data are distributed among different nodes. While significant work has been done in the contexts of supervised and unsupervised learning, the intermediate case of semi-supervised learning has received less attention. In this paper, we propose an algorithm for this class of problems, by extending the framework of manifold regularization. The main component of the proposed algorithm consists of a fully distributed computation of the adjacency matrix of the training patterns. To this end, a novel low-rank...

10.1109/tnnls.2016.2597444 article EN IEEE Transactions on Neural Networks and Learning Systems 2016-08-26
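
A centralized sketch of Laplacian-regularized (manifold-regularized) least squares over labeled and unlabeled points, the learning problem behind the abstract above; the article's focus is on computing the adjacency/Laplacian part in a fully distributed fashion across nodes. Kernel, adjacency, and regularization choices here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf_matrix(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy semi-supervised problem: a few labeled points, many unlabeled ones.
X_lab = rng.normal(size=(20, 2)); y_lab = np.sign(X_lab[:, 0])
X_unlab = rng.normal(size=(80, 2))

X = np.vstack([X_lab, X_unlab])
n, n_lab = len(X), len(X_lab)
K = rbf_matrix(X, X)                            # kernel matrix (also used as adjacency here)
L = np.diag(K.sum(axis=1)) - K                  # graph Laplacian of the data graph
J = np.zeros((n, n)); J[:n_lab, :n_lab] = np.eye(n_lab)   # selects labeled points
y = np.concatenate([y_lab, np.zeros(n - n_lab)])

lam, lam_I = 1e-2, 1e-2                         # ambient and manifold regularization weights
alpha = np.linalg.solve(J @ K + lam * np.eye(n) + lam_I * L @ K, y)
f = K @ alpha                                   # predictions on labeled + unlabeled points
print(np.mean(np.sign(f[:n_lab]) == y_lab))     # accuracy on the labeled subset
```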