Input correlations impede suppression of chaos and learning in balanced firing-rate networks.
Research Article
Subjects: Machine Learning (cs.LG); Disordered Systems and Neural Networks (cond-mat.dis-nn); Chaotic Dynamics (nlin.CD); Neurons and Cognition (q-bio.NC)
Keywords: computational neuroscience; networks, dynamical systems; neurons; learning; nerve net; action potentials; models, neurological
DOI: 10.48550/arxiv.2201.09916
Publication Date: 2022-12-05
AUTHORS (5)
ABSTRACT
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with common input to each neuron than with independent input. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, the recurrent coupling strength, and the network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
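The central quantity in the abstract, the largest Lyapunov exponent of a driven rate network, can be estimated numerically by tracking the expansion rate of a small perturbation along a reference trajectory. The sketch below uses a generic random rate network dx/dt = -x + J·tanh(x) + I(t), not the paper's balanced architecture, so it only illustrates chaos suppression by external drive; the common-vs-independent asymmetry described in the abstract requires the balanced structure and is not reproduced by a dense Gaussian coupling matrix. All parameter values (N, g, amplitude, frequency) are illustrative assumptions.

```python
import numpy as np

def largest_lyapunov(N=200, g=2.0, T=200.0, dt=0.05,
                     amp=0.0, freq=0.1, common=True, seed=0):
    """Estimate the largest Lyapunov exponent of the driven rate network
    dx/dt = -x + J @ tanh(x) + I(t) via the two-trajectory method.
    NOTE: illustrative model, not the balanced network of the paper."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random coupling, gain g
    if common:
        phases = np.zeros(N)                   # identical sinusoid to every neuron
    else:
        phases = rng.uniform(0, 2 * np.pi, N)  # phase-randomized, "independent" input
    x = rng.normal(0.0, 1.0, N)
    d0 = 1e-8                                  # initial separation of companion trajectory
    y = x + d0 * rng.normal(0.0, 1.0, N) / np.sqrt(N)
    lam_sum = 0.0
    steps = int(T / dt)
    for k in range(steps):
        t = k * dt
        I = amp * np.sin(2 * np.pi * freq * t + phases)
        x = x + dt * (-x + J @ np.tanh(x) + I)  # Euler step, reference trajectory
        y = y + dt * (-y + J @ np.tanh(y) + I)  # Euler step, perturbed trajectory
        d = np.linalg.norm(y - x)
        lam_sum += np.log(d / d0)               # accumulate local expansion rate
        y = x + (d0 / d) * (y - x)              # renormalize separation to d0
    return lam_sum / (steps * dt)               # time-averaged exponent
```

For g above the chaotic transition the undriven exponent is positive; a sufficiently strong, slow sinusoidal drive pulls units into saturation and lowers it, which is the generic suppression-of-chaos effect the paper's mean-field theory quantifies for balanced networks.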