Grace W. Lindsay

ORCID: 0000-0001-9904-7471

Research Areas
  • Neural dynamics and brain function
  • Visual perception and processing mechanisms
  • Advanced Memory and Neural Computing
  • Cell Image Analysis Techniques
  • EEG and Brain-Computer Interfaces
  • Neural Networks and Applications
  • Neuroscience and Neuropharmacology Research
  • Explainable Artificial Intelligence (XAI)
  • CCD and CMOS Imaging Sensors
  • Functional Brain Connectivity Studies
  • Experimental Learning in Engineering
  • Memory and Neural Mechanisms
  • Genetics, Bioinformatics, and Biomedical Research
  • Cognitive Computing and Networks
  • Machine Learning and ELM
  • Machine Learning in Materials Science
  • Visual Attention and Saliency Detection
  • Extracellular vesicles in disease
  • Advanced Neural Network Applications
  • Cognitive Science and Mapping
  • Neural and Behavioral Psychology Studies
  • Philosophy and History of Science
  • Biomedical and Engineering Education
  • Slime Mold and Myxomycetes Research
  • Gene Regulatory Network Analysis

Affiliations

New York University
2022-2023

University College London
2019-2022

Oxford Centre for Computational Neuroscience
2022

Sainsbury Laboratory
2019-2021

Columbia University
2014-2019

Brain (Germany)
2018

Royal College of Physicians
2017

Abstract Convolutional neural networks (CNNs) were inspired by early findings in the study of biological vision. They have since become successful tools in computer vision and state-of-the-art models of both neural activity and behavior on visual tasks. This review highlights what, in the context of CNNs, it means to be a good model in computational neuroscience and the various ways models can provide insight. Specifically, it covers the origins of CNNs and the methods by which we validate them as models of biological vision. It then goes on to elaborate on what we can learn about biological vision by understanding...

10.1162/jocn_a_01544 article EN Journal of Cognitive Neuroscience 2020-02-06
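
A minimal sketch of the kind of validation the review discusses: mapping a CNN layer's activations onto recorded neural responses and scoring predictivity on held-out stimuli. Everything below is illustrative (random images, random "neural" data, an arbitrary layer choice) and assumes recent torchvision and scikit-learn installs; it is not code from the paper.

```python
# Illustrative sketch only: linear mapping from CNN layer activations to
# (hypothetical) neural responses, a common way CNNs are tested as models of vision.
import numpy as np
import torch
from torchvision import models
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

cnn = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()

images = torch.rand(200, 3, 224, 224)   # stand-in stimulus set
neural = np.random.rand(200, 50)        # stand-in responses of 50 recorded neurons

# Capture activations of an intermediate convolutional layer with a forward hook.
feats = {}
cnn.features[8].register_forward_hook(
    lambda module, inp, out: feats.update(x=out.detach().flatten(1).numpy())
)
with torch.no_grad():
    cnn(images)

# Ridge regression from CNN features to neural responses, scored on held-out images.
Xtr, Xte, ytr, yte = train_test_split(feats["x"], neural, test_size=0.25, random_state=0)
print("held-out R^2:", Ridge(alpha=1.0).fit(Xtr, ytr).score(Xte, yte))
```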

Whether current or near-term AI systems could be conscious is a topic of scientific interest and increasing public concern. This report argues for, and exemplifies, a rigorous and empirically grounded approach to AI consciousness: assessing existing AI systems in detail, in light of our best-supported neuroscientific theories of consciousness. We survey several prominent scientific theories of consciousness, including recurrent processing theory, global workspace theory, higher-order theories, predictive processing, and attention schema theory. From these theories we...

10.48550/arxiv.2308.08708 preprint EN cc-by-nc-sa arXiv (Cornell University) 2023-01-01

How does attentional modulation of neural activity enhance performance? Here we use a deep convolutional network as a large-scale model of the visual system to address this question. We model the feature similarity gain model of attention, in which attentional modulation is applied according to stimulus tuning. Using a variety of visual tasks, we show that neural modulations of the kind and magnitude observed experimentally lead to performance changes of the kind and magnitude observed experimentally. We find that, at earlier layers, attention applied according to tuning does not successfully propagate through the network, and has a weaker...

10.7554/elife.38105 article EN cc-by eLife 2018-10-01
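
A rough illustration of the feature-similarity-gain idea described above: each unit's activity is scaled multiplicatively according to how strongly it is tuned to the attended category. The tuning values, gain strength beta, and layer activity below are all hypothetical; this is a sketch of the general mechanism, not the paper's implementation.

```python
# Illustrative sketch of feature-similarity-gain-style attention on one model layer.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_categories = 100, 10

# Hypothetical tuning values: each unit's mean response to each category,
# z-scored per unit so positive values mean "prefers this category".
tuning = rng.normal(size=(n_units, n_categories))
tuning = (tuning - tuning.mean(1, keepdims=True)) / tuning.std(1, keepdims=True)

def apply_feature_attention(activity, attended_category, beta=0.5):
    """Scale each unit's activity up or down according to its tuning for the attended category."""
    gain = 1.0 + beta * tuning[:, attended_category]
    return np.maximum(activity * gain, 0.0)   # keep rates non-negative

activity = rng.random(n_units)                # hypothetical layer activity
modulated = apply_feature_attention(activity, attended_category=3)
print(modulated[:5])
```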

Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear "mixed" selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit...

10.1523/jneurosci.1222-17.2017 article EN cc-by-nc-sa Journal of Neuroscience 2017-10-06
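
As a loose illustration of the nonlinear mixed selectivity concept, the sketch below simulates a unit whose response to one stimulus-context combination exceeds the sum of its stimulus and context effects, then compares an additive regression model to a full condition-means model. The data and effect sizes are invented for the example; this is not the paper's analysis.

```python
# Illustrative sketch: detecting nonlinear mixed selectivity in simulated firing rates.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n_stim, n_ctx, n_trials = 4, 2, 50

# Simulated trial-by-trial rates with one stimulus x context interaction term.
rates, stim_id, ctx_id = [], [], []
for s, c in product(range(n_stim), range(n_ctx)):
    mean = 2.0 * s + 3.0 * c + (4.0 if (s == 2 and c == 1) else 0.0)
    rates.append(mean + rng.normal(0.0, 0.5, n_trials))
    stim_id.append(np.full(n_trials, s))
    ctx_id.append(np.full(n_trials, c))
rates = np.concatenate(rates)
S, C = np.concatenate(stim_id), np.concatenate(ctx_id)

def r_squared(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

X_additive = np.column_stack([np.ones_like(rates), np.eye(n_stim)[S][:, 1:], C])  # stimulus + context only
X_full = np.eye(n_stim * n_ctx)[S * n_ctx + C]                                     # one regressor per condition
print("additive model R^2:       ", round(r_squared(X_additive, rates), 3))
print("condition-means model R^2:", round(r_squared(X_full, rates), 3))
```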

Abstract Behavioral studies suggest that recurrence in the visual system is important for processing degraded stimuli. There are two broad anatomical forms this recurrence can take, lateral or feedback, each with different assumed functions. Here we add four kinds of recurrence—two of each anatomical form—to a feedforward convolutional neural network and find all capable of increasing the network's ability to classify noisy digit images. Specifically, we take inspiration from findings in biology by adding predictive feedback and surround...

10.1101/2022.03.07.483196 preprint EN cc-by-nd bioRxiv (Cold Spring Harbor Laboratory) 2022-03-08
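
A minimal sketch of the general setup described above, under the assumption of a small PyTorch model: a single lateral (within-layer) recurrent connection added to a feedforward convolutional layer, iterated for a few timesteps before readout. The architecture, sizes, and noise level are illustrative, not the paper's.

```python
# Illustrative sketch: one form of recurrence (lateral) added to a feedforward CNN.
import torch
import torch.nn as nn

class LateralRecurrentCNN(nn.Module):
    """Feedforward conv layer whose output is refined by a lateral recurrent conv."""
    def __init__(self, n_classes=10, timesteps=3):
        super().__init__()
        self.timesteps = timesteps
        self.feedforward = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.lateral = nn.Conv2d(16, 16, kernel_size=3, padding=1)   # within-layer recurrence
        self.readout = nn.Linear(16 * 28 * 28, n_classes)

    def forward(self, x):
        drive = torch.relu(self.feedforward(x))       # fixed feedforward drive
        h = drive
        for _ in range(self.timesteps):               # lateral recurrence refines the response
            h = torch.relu(drive + self.lateral(h))
        return self.readout(h.flatten(1))

# Hypothetical batch of noisy MNIST-sized digits.
noisy_digits = torch.rand(8, 1, 28, 28) + 0.5 * torch.randn(8, 1, 28, 28)
print(LateralRecurrentCNN()(noisy_digits).shape)      # torch.Size([8, 10])
```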

Abstract Selective visual attention modulates neural activity in the visual system in complex ways and leads to enhanced performance on difficult visual tasks. Here, we show that a simple circuit model, the stabilized supralinear network, gives a unified account of a wide variety of effects of attention on neural responses. We replicate results from studies of both feature and spatial attention, addressing findings from different experimental paradigms on changes in firing rates and correlated variability. Finally, we expand this model into an architecture that can perform tasks—a...

10.1101/2019.12.13.875534 preprint EN cc-by-nc-nd bioRxiv (Cold Spring Harbor Laboratory) 2019-12-13
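
For orientation, the sketch below simulates generic stabilized-supralinear-network dynamics, tau * dr/dt = -r + k * [W r + input]_+^n, with "attention" modeled as extra input to a subset of excitatory units. Network size and all parameter values are illustrative assumptions rather than those used in the paper.

```python
# Illustrative sketch: Euler integration of a small stabilized supralinear network.
import numpy as np

rng = np.random.default_rng(2)
n_e, n_i = 4, 4
n = n_e + n_i

# Random connectivity obeying Dale's law: excitatory columns positive, inhibitory negative.
W = np.abs(rng.normal(0.0, 0.05, (n, n)))
W[:, n_e:] *= -1.5

def simulate(ext_input, k=0.04, power=2.0, tau=0.01, dt=0.001, steps=3000):
    """Euler-integrate tau * dr/dt = -r + k * [W r + input]_+ ** power."""
    r = np.zeros(n)
    for _ in range(steps):
        drive = np.maximum(W @ r + ext_input, 0.0)
        r += (dt / tau) * (-r + k * drive ** power)
    return r

stimulus = np.full(n, 20.0)
attention = np.zeros(n)
attention[:2] = 5.0                       # "attend": extra drive to two excitatory units
print("unattended E rates:", simulate(stimulus)[:n_e].round(2))
print("attended E rates:  ", simulate(stimulus + attention)[:n_e].round(2))
```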

Abstract Complex cognitive behaviors, such as context-switching and rule-following, are thought to be supported by the prefrontal cortex (PFC). Neural activity in the PFC must thus be specialized to specific tasks while retaining flexibility. Nonlinear ‘mixed’ selectivity is an important neurophysiological trait for enabling complex and context-dependent behaviors. Here we investigate (1) the extent to which the PFC exhibits computationally relevant properties, such as mixed selectivity, and (2) how such properties could arise via circuit mechanisms. We show...

10.1101/133025 preprint EN cc-by-nc bioRxiv (Cold Spring Harbor Laboratory) 2017-05-02

10.1038/s41562-023-01730-6 article EN Nature Human Behaviour 2023-11-20

Abstract One of the most fundamental organizational principles of the brain is the separation of excitatory (E) and inhibitory (I) neurons. In addition to their opposing effects on post-synaptic neurons, E and I cells tend to differ in their selectivity and connectivity. Although many such differences have been characterized experimentally, it is not clear why they exist in the first place. We studied this question in an artificial neural network equipped with multiple cell types. We found that a deep convolutional recurrent network trained...

10.1101/680439 preprint EN bioRxiv (Cold Spring Harbor Laboratory) 2019-06-25
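
A minimal sketch of one standard way to build E/I-separated units in an artificial network (assumed here, not taken from the paper): weight magnitudes are learned without constraint, but each unit's outgoing sign is fixed, so excitatory units only excite and inhibitory units only inhibit (Dale's law).

```python
# Illustrative sketch: a sign-constrained (Dale's law) recurrent layer in PyTorch.
import torch
import torch.nn as nn

class DaleRecurrentLayer(nn.Module):
    """Recurrent layer whose units are either excitatory or inhibitory."""
    def __init__(self, n_e=80, n_i=20):
        super().__init__()
        n = n_e + n_i
        self.raw = nn.Parameter(0.1 * torch.randn(n, n))   # unconstrained, learned magnitudes
        sign = torch.ones(n)
        sign[n_e:] = -1.0                                   # columns = presynaptic cell type
        self.register_buffer("sign", sign)

    def effective_weights(self):
        # Non-negative magnitudes times a fixed per-column sign:
        # E cells only excite, I cells only inhibit.
        return torch.relu(self.raw) * self.sign

    def forward(self, rates):
        return torch.relu(rates @ self.effective_weights().T)

layer = DaleRecurrentLayer()
out = layer(torch.rand(5, 100))                             # 5 samples, 100 units
print(out.shape)                                            # torch.Size([5, 100])
```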

Abstract How does attentional modulation of neural activity enhance performance? Here we use a deep convolutional network as a large-scale model of the visual system to address this question. We model the feature similarity gain model of attention, in which attentional modulation is applied according to stimulus tuning. Using a variety of visual tasks, we show that neural modulations of the kind and magnitude observed experimentally lead to performance changes of the kind and magnitude observed experimentally. We find that, at earlier layers, attention applied according to tuning does not successfully propagate through the network, and has...

10.1101/233338 preprint EN cc-by bioRxiv (Cold Spring Harbor Laboratory) 2017-12-13

Abstract One of the most fundamental organizational principles of the brain is the separation of excitatory (E) and inhibitory (I) neurons. In addition to their opposing effects on post-synaptic neurons, E and I cells tend to differ in their selectivity and connectivity. Although many such differences have been characterized experimentally, it is not clear why they exist in the first place. We studied this question in an artificial neural network equipped with multiple cell types. We found that a deep convolutional recurrent network trained...

10.32470/ccn.2019.1265-0 article EN cc-by 2022 Conference on Cognitive Computational Neuroscience 2019-01-01