- Neural Dynamics and Brain Function
- Advanced Memory and Neural Computing
- Neural Networks and Applications
- Functional Brain Connectivity Studies
- Memory and Neural Mechanisms
- Neuroscience and Neuropharmacology Research
- Stochastic Dynamics and Bifurcation
- Gene Regulatory Network Analysis
- EEG and Brain-Computer Interfaces
- Neural and Behavioral Psychology Studies
- Photoreceptor and Optogenetics Research
- Error Correcting Code Techniques
- Lipid Membrane Structure and Behavior
- DNA and Biological Computing
- Cooperative Communication and Network Coding
- Neural Networks and Reservoir Computing
- Ferroelectric and Negative Capacitance Devices
- Sleep and Wakefulness Research
- Machine Learning and ELM
- Cognitive Science and Mapping
- Complex Network Analysis Techniques
- Visual Perception and Processing Mechanisms
- Zebrafish Biomedical Research Applications
- Neuroscience and Neural Engineering
University of California, Davis
2020-2025
The University of Texas at Austin
2015-2020
New York University
2015-2017
Yale University
2011-2014
Neurons show diverse timescales, so that different parts of a network respond with disparate temporal dynamics. Such diversity is observed both when comparing timescales across brain areas and among cells within local populations; the underlying circuit mechanism remains unknown. We examine conditions under which spatially structured connectivity can produce such behavior. In a linear network, timescales are segregated if the eigenvectors of the connectivity matrix are localized to different parts of the network. We develop a framework to predict the shapes of localized eigenvectors...
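The localization mechanism described above can be sketched numerically. The toy network below is hypothetical and its parameters are illustrative (nothing here is taken from the paper's actual model): a linear rate network dx/dt = (-I + W)x with weak nearest-neighbour links and a spatial gradient of self-coupling, in which the slow eigenvectors concentrate at the strongly coupled end.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Spatial gradient of recurrent self-excitation along the network.
positions = np.linspace(0.0, 1.0, n)
W = np.zeros((n, n))
W[np.arange(n - 1), np.arange(1, n)] = 0.1   # weak nearest-neighbour coupling
W[np.arange(1, n), np.arange(n - 1)] = 0.1
np.fill_diagonal(W, 0.9 * positions)         # self-coupling grows toward one end

eigvals, eigvecs = np.linalg.eig(-np.eye(n) + W)

# Inverse participation ratio (IPR): ~1/n for a delocalized mode,
# approaching 1 for a mode concentrated on a single node.
v = np.abs(eigvecs) ** 2
ipr = (v ** 2).sum(axis=0) / (v.sum(axis=0) ** 2)

# The slowest mode (eigenvalue closest to zero) localizes at the strongly
# self-coupled end, so that part of the network relaxes slowly.
slowest = int(np.argmax(eigvals.real))
com = (v[:, slowest] / v[:, slowest].sum()) @ positions
print(f"slowest-mode IPR: {ipr[slowest]:.3f}, centre of mass: {com:.2f}")
```

Because the gradient acts like a graded potential, eigenvectors are pinned to different positions, and each position inherits the timescale of the modes localized there.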
Brain electric field potentials are dominated by an arrhythmic broadband signal, but the underlying mechanism is poorly understood. Here we propose that such power spectra characterize recurrent neural networks of nodes (neurons or clusters of neurons), endowed with an effective balance between excitation and inhibition tuned to keep the network on the edge of dynamical instability. These networks show a fast mode reflecting local dynamics and a slow mode emerging from distributed connections. Together, the two modes produce power spectra similar to those...
Populations of neurons represent sensory, motor, and cognitive variables via patterns of activity distributed across the population. The size of the population used to encode a variable is typically much greater than the dimension of the variable itself; thus, the corresponding neural activity occupies lower-dimensional subsets of the full set of possible states. Given data with such structure, a fundamental question asks how close the low-dimensional data lie to a linear subspace. The linearity or nonlinearity of the structure reflects important computational features...
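The linear-versus-nonlinear question has a classic concrete case: a circular variable encoded with cosine tuning is intrinsically one-dimensional but does not lie in any one-dimensional linear subspace; PCA needs exactly two components to capture it. A short sketch (the tuning model and sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D circular variable (an angle) encoded by 50 cosine-tuned neurons.
theta = rng.uniform(0, 2 * np.pi, 1000)
prefs = np.linspace(0, 2 * np.pi, 50, endpoint=False)
X = np.cos(theta[:, None] - prefs[None, :])   # 1000 samples x 50 neurons

# Principal-component variance spectrum via SVD of the centred data.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
var = s ** 2 / (s ** 2).sum()
print("variance fractions of first 4 PCs:", np.round(var[:4], 3))
```

Since cos(θ − φ) = cos θ cos φ + sin θ sin φ, the data are exactly rank 2: two components carry essentially all the variance, yet no single component suffices, which is the signature of a nonlinear (here, ring-shaped) one-dimensional structure.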
We developed a large-scale dynamical model of the macaque neocortex based on recent quantitative connectivity data. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-making and working memory). The model displays multiple temporal hierarchies, as evidenced by contrasting responses to visual and somatosensory stimulation...
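How a timescale hierarchy can emerge from graded recurrent strength is visible already in a single rate unit: with self-excitation w, the effective time constant is tau/(1 − w), so a gradient of w alone stretches timescales. A minimal sketch, not the paper's large-scale model (parameters are illustrative):

```python
import numpy as np

tau, dt = 0.01, 1e-4              # 10 ms intrinsic time constant, Euler step
times = {}
for w in (0.1, 0.6, 0.9):         # increasing recurrent self-excitation
    r, t = 1.0, 0.0
    while r > 1 / np.e:           # time for activity to decay by a factor e
        r += (dt / tau) * (-r + w * r)
        t += dt
    times[w] = t
    print(f"w={w}: decay time {t*1e3:.1f} ms (theory {tau/(1-w)*1e3:.1f} ms)")
```

Strongly recurrent "association-like" units integrate and hold inputs far longer than weakly recurrent "sensory-like" units, even though every unit has the same intrinsic time constant.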
Hippocampal circuits in the brain enable two distinct cognitive functions: construction of spatial maps for navigation and storage of sequential episodic memories. This dual role remains an enduring enigma. While there have been advances in modeling the spatial representations of the hippocampus, we lack good models of its memory function. Here we present a neocortical-entorhinal-hippocampal network model that implements a high-capacity general associative memory, spatial memory, and episodic memory by factorizing memory content from the dynamics generating...
The brain constructs distributed representations of key low-dimensional variables. These variables may be external stimuli or internal quantities relevant for survival, such as a sense of one's location in the world. We consider that high-dimensional population-level activity vectors are the fundamental representational currency of a neural circuit, and that these vectors trace out a manifold whose dimension and topology match those of the represented variable. This perspective, applied to the mammalian head direction...
An elemental computation in the brain is to identify the best in a set of options and report its value. It is required for inference, decision-making, optimization, action selection, consensus, and foraging. Neural computing is considered powerful because of its parallelism; however, it is unclear whether neurons can perform this max-finding operation in a way that improves upon the prohibitively slow optimal serial strategy (which takes [Formula: see text] time for N noisy candidate options) by a factor of N, the benchmark for parallel computation...
The power spectrum of brain electric field potential recordings is dominated by an arrhythmic broadband signal, but a mechanistic account of its underlying neural network dynamics is lacking. Here we show how the broadband spectrum can be explained by a simple random network of nodes near criticality. Such a recurrent network produces activity with a combination of fast and slow autocorrelation time constants, with the fast mode corresponding to local dynamics and the slow mode resulting from excitatory connections across the network. These modes are combined to produce power spectra similar to that...
Intrinsic neural timescales characterize the dynamics of endogenous fluctuations in activity. We measured the intrinsic timescales of prefrontal neurons and examined their changes during posterior parietal cortex (PPC) inactivation. Frontal eye field (FEF) neurons within the prefrontal cortex (PFC) exhibited a bimodal distribution of timescales: short-timescale neurons showed stronger transient visual responses, while long-timescale neurons showed sustained modulation during stimulus-driven attention. PPC inactivation increased the timescales of both neuron types, with a 15-fold greater...
The brain must robustly store a large number of memories, corresponding to the many events encountered over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall fails catastrophically with vanishingly little noise. We construct an associative content-addressable memory with exponentially many stable states and robust error-correction. The network possesses expander graph connectivity on a restricted Boltzmann machine architecture. The expansion property allows simple neural dynamics to perform...
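For orientation, the standard baseline this work improves upon can be run in a few lines: a classic Hopfield network (capacity only linear in network size, roughly 0.14n patterns; this is NOT the expander-graph / restricted-Boltzmann architecture the abstract describes) recovering a stored pattern from a corrupted cue.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 10                      # 200 neurons, 10 stored patterns (low load)

patterns = rng.choice([-1, 1], size=(p, n))
W = (patterns.T @ patterns) / n     # Hebbian outer-product weights
np.fill_diagonal(W, 0)

x = patterns[0].copy()              # corrupt 10% of one stored pattern
flip = rng.choice(n, size=n // 10, replace=False)
x[flip] = -x[flip]

for _ in range(20):                 # synchronous sign-threshold updates
    x = np.where(W @ x >= 0, 1, -1)

overlap = (x @ patterns[0]) / n     # 1.0 means perfect recall
print("overlap with stored pattern:", overlap)
```

At this low memory load the corrupted bits are cleaned up; the abstract's point is that such error-correction can be made to survive an exponentially large number of stored states, which this baseline cannot do.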
The human brain has immense learning capabilities at extreme energy efficiencies and a scale that no artificial system has been able to match. For decades, reverse engineering the brain has been one of the top priorities of science and technology research. Despite numerous efforts, conventional electronics-based methods have failed to match the scalability, efficiency, and self-supervised learning capabilities of the brain. On the other hand, very recent progress in the development of new generations of photonic and electronic memristive materials, device technologies, and 3D...
Chaos is generic in strongly coupled recurrent networks of model neurons, and is thought to be an easily accessible dynamical regime in the brain. While neural chaos is typically seen as an impediment to robust computation, we show how such chaos might play a functional role by allowing the brain to learn and sample from generative models. We construct architectures that combine a classic chaotic network either with a canonical generative modeling architecture or with energy-based models of memory. These networks have appealing properties for sampling, including easy...
Populations of neurons represent sensory, motor, and cognitive variables via patterns of activity distributed across the population. The size of the population used to encode a variable is typically much greater than the dimension of the variable itself; thus, the corresponding neural activity occupies lower-dimensional subsets of the full set of possible states. Given data with such structure, a fundamental question asks how close the low-dimensional data lie to a linear subspace. The linearity or non-linearity of the structure reflects important computational...
Identifying the maximal element (max, argmax) in a set is a core computational element in inference, decision making, optimization, action selection, consensus, and foraging. Running sequentially through a list of N fluctuating items takes N log(N) time to accurately find the max, prohibitively slow for large N. The power of computation in the brain is ascribed in part to its parallelism, yet it is theoretically unclear whether leaky and noisy neurons can perform a distributed computation that cuts the required time of the serial strategy by a factor of N, the benchmark for parallel...
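A hedged sketch of distributed max-finding, using a generic race of leaky accumulators under shared global inhibition rather than the paper's specific circuit; all parameters here are illustrative. Each accumulator integrates its own noisy option value in parallel, and inhibition from the total activity progressively silences the weaker options, so the best option is typically the first to reach threshold.

```python
import numpy as np

rng = np.random.default_rng(4)

def wta_race(values, threshold=25.0, leak=0.05, inhibition=0.01, max_steps=10000):
    """Leaky noisy accumulators race to threshold under shared inhibition."""
    x = np.zeros(values.size)
    for _ in range(max_steps):
        # Each unit integrates its noisy value minus global inhibition.
        drive = values + rng.standard_normal(values.size) - inhibition * x.sum()
        x = np.maximum(x + (-leak * x + drive), 0.0)   # rectified rates
        if x.max() >= threshold:
            break
    return int(np.argmax(x))

values = rng.uniform(0.0, 1.0, 50)
values[17] = 2.0                    # one clearly-best option
winner = wta_race(values)
print("winner:", winner, "is argmax:", winner == int(np.argmax(values)))
```

All N comparisons happen simultaneously each step, so the time to a decision is set by the integration timescale and the gap between the best and second-best values, not by N itself, which is the parallel speedup the abstract asks about.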