Common population codes produce extremely nonlinear neural manifolds

DOI: 10.1073/pnas.2305853120
Publication Date: 2023-09-21T17:40:38Z
ABSTRACT
Populations of neurons represent sensory, motor, and cognitive variables via patterns of activity distributed across the population. The size of the population used to encode a variable is typically much greater than the dimension of the variable itself; thus, the corresponding neural activity occupies lower-dimensional subsets of the full set of possible states. Given data with such structure, a fundamental question asks how close the low-dimensional data lie to a linear subspace. The linearity or nonlinearity of this structure reflects important computational features of the encoding, such as robustness and generalizability. Moreover, identifying such linear structure underlies common analysis methods such as Principal Component Analysis (PCA). Here, we show that for data drawn from many common population codes, the resulting point clouds and manifolds are exceedingly nonlinear, with the dimension of the best-fitting linear subspace growing at least exponentially with the true dimension of the data. Consequently, methods like PCA fail dramatically at identifying the underlying structure, even in the limit of arbitrarily many data points and no noise.
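The phenomenon described in the abstract can be illustrated with a minimal sketch (not taken from the paper; all parameters here are assumptions for illustration): a population of neurons with narrow Gaussian-like tuning curves encodes a one-dimensional circular stimulus, yet PCA on the resulting activity needs many components to capture most of the variance, far more than the true latent dimension of one.

```python
import numpy as np

# Hypothetical population code: N neurons with von Mises-like tuning
# curves over a 1D circular stimulus variable theta.
rng = np.random.default_rng(0)
N = 100                                                   # number of neurons
theta = np.linspace(0, 2 * np.pi, 500, endpoint=False)    # stimulus samples
centers = np.linspace(0, 2 * np.pi, N, endpoint=False)    # preferred stimuli
width = 0.2                                               # narrow tuning (assumed)

# Population response matrix: samples x neurons.
d = np.cos(theta[:, None] - centers[None, :])
X = np.exp((d - 1) / width)

# PCA via SVD on mean-centered responses.
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
var_explained = s**2 / np.sum(s**2)

# Components needed to capture 95% of the variance. The true latent
# dimension is 1 (a ring), but the linear dimension is much larger,
# and it grows further as the tuning curves narrow.
k95 = int(np.searchsorted(np.cumsum(var_explained), 0.95)) + 1
print("components for 95% variance:", k95)
```

Narrowing `width` spreads the variance across ever more principal components, which is the linear-dimension blow-up the abstract describes in miniature.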