- Neural dynamics and brain function
- Neural Networks and Applications
- Particle Accelerators and Free-Electron Lasers
- Particle accelerators and beam dynamics
- Advanced Memory and Neural Computing
- Stochastic dynamics and bifurcation
- Blind Source Separation Techniques
- Natural Language Processing Techniques
- Image Retrieval and Classification Techniques
- Particle Detector Development and Performance
- Superconducting Materials and Applications
- Anomaly Detection Techniques and Applications
- EEG and Brain-Computer Interfaces
- Science Education and Pedagogy
- Handwritten Text Recognition Techniques
- Gyrotron and Vacuum Electronics Research
- Time Series Analysis and Forecasting
- Innovative Teaching and Learning Methods
- Experimental Learning in Engineering
- Functional Brain Connectivity Studies
- Visual perception and processing mechanisms
- Digital Media Forensic Detection
- Electron and X-Ray Spectroscopy Techniques
- Face recognition and analysis
- Generative Adversarial Networks and Image Synthesis
Clear Science Corporation (United States)
2025
Lawrence Berkeley National Laboratory
2017-2024
University of California, Berkeley
2016-2022
Center for Theoretical Biological Physics
2021-2022
Compass (United States)
2018
Cornell University
2014
Theano is a Python library that allows one to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Since its introduction, it has been one of the most used CPU and GPU mathematical compilers, especially in the machine learning community, and has shown steady performance improvements. Theano has been actively and continuously developed since 2008; multiple frameworks have been built on top of it to produce many state-of-the-art models. The present article is structured as follows. Section I provides an...
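As an illustration of the workflow this abstract describes, here is a minimal sketch (not taken from the paper) of declaring symbolic variables, building an expression graph, and compiling it with Theano:

```python
# Minimal Theano workflow: declare symbolic variables, build an expression
# graph, and compile it to an optimized callable function.
import numpy as np
import theano
import theano.tensor as T

# Symbolic inputs: a matrix of features and a vector of weights.
X = T.dmatrix('X')
w = T.dvector('w')

# Symbolic expression: a logistic regression forward pass.
p = 1.0 / (1.0 + T.exp(-T.dot(X, w)))

# Compile the graph; Theano optimizes it and can target CPU or GPU.
predict = theano.function(inputs=[X, w], outputs=p)

print(predict(np.random.randn(4, 3), np.zeros(3)))  # probabilities near 0.5
```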
Deep learning has enjoyed a great deal of success because of its ability to learn useful features for tasks such as classification. But there has been less exploration in learning the factors of variation apart from the classification signal. By augmenting autoencoders with simple regularization terms during training, we demonstrate that standard deep architectures can discover and explicitly represent factors of variation beyond those relevant for categorization. We introduce a cross-covariance penalty (XCov) as a method to disentangle factors like...
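A minimal NumPy sketch of a cross-covariance penalty of the kind the abstract describes, penalizing covariance between class-related units and the remaining latent units so the two groups encode separate factors; the function name and shapes are illustrative, not the paper's code:

```python
# Cross-covariance (XCov-style) penalty: squared Frobenius norm of the
# batch cross-covariance between two groups of activations.
import numpy as np

def xcov_penalty(y, z):
    """y: (N, C) class-related activations; z: (N, D) latent activations."""
    n = y.shape[0]
    yc = y - y.mean(axis=0)               # center each unit over the batch
    zc = z - z.mean(axis=0)
    cross_cov = yc.T @ zc / n             # (C, D) batch cross-covariance
    return 0.5 * np.sum(cross_cov ** 2)   # small when the groups are decorrelated

# Example: uncorrelated random activations give a small penalty.
rng = np.random.default_rng(0)
print(xcov_penalty(rng.standard_normal((256, 10)), rng.standard_normal((256, 32))))
```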
A fundamental challenge in neuroscience is to understand what structure in the world is represented in spatially distributed patterns of neural activity from multiple single-trial measurements. This is often accomplished by learning a simple, linear transformation between neural features and features of the sensory stimuli or motor task. While successful in some early sensory processing areas, linear mappings are unlikely to be ideal tools for elucidating the nonlinear, hierarchical representations of higher-order brain areas during complex tasks, such...
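A minimal sketch of the baseline linear-mapping approach the abstract refers to, fitting a regularized linear transformation from single-trial neural features to stimulus features; the data here are synthetic placeholders, not the paper's data or method:

```python
# Baseline linear decoding: ridge regression from neural features to
# stimulus features, scored with cross-validated R^2.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
neural = rng.standard_normal((200, 64))            # trials x neural features
stimulus = neural @ rng.standard_normal((64, 5))   # trials x stimulus features
stimulus += 0.1 * rng.standard_normal(stimulus.shape)

model = Ridge(alpha=1.0)
scores = cross_val_score(model, neural, stimulus, cv=5, scoring='r2')
print(scores.mean())  # decoding performance of the linear map
```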
Decoding behavior, perception, or cognitive state directly from neural signals is critical for brain-computer interface research and an important tool for systems neuroscience. In the last decade, deep learning has become the state-of-the-art method in many machine learning tasks, ranging from speech recognition to image segmentation. The success of deep networks in other domains has led to a new wave of applications in neuroscience. In this article, we review deep learning approaches to neural decoding. We describe the architectures used for extracting useful features from neural recording...
Neural markers of visual function in age-related macular degeneration (AMD) allow clinicians and researchers to directly evaluate the functional changes in visual processing which occur as a result of progressive loss of afferent input from the macula. Unfortunately, few protocols exist that elicit such neural markers, and most of these are poorly adapted to AMD. Here, we propose a novel method for embedding frequency tags into full color motion videos by periodically manipulating contrast information at different spatial...
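A minimal sketch, on synthetic frames, of embedding a frequency tag by periodically modulating contrast at a fixed tag frequency; the function, parameters, and data are illustrative assumptions rather than the paper's protocol:

```python
# Frequency tagging by sinusoidal contrast modulation: deviations from each
# frame's mean luminance are scaled by an envelope oscillating at the tag rate.
import numpy as np

def tag_contrast(frames, tag_hz, frame_rate):
    """frames: (T, H, W) grayscale video in [0, 1]; returns tagged frames."""
    t = np.arange(frames.shape[0]) / frame_rate
    # Contrast envelope oscillating between 0.5 and 1.0 at the tag frequency.
    envelope = 0.75 + 0.25 * np.sin(2 * np.pi * tag_hz * t)
    mean_lum = frames.mean(axis=(1, 2), keepdims=True)
    return mean_lum + envelope[:, None, None] * (frames - mean_lum)

video = np.random.rand(300, 64, 64)          # 5 s of 60 Hz placeholder frames
tagged = tag_contrast(video, tag_hz=6.0, frame_rate=60.0)
```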
We have designed an introductory laboratory course that engaged first-year undergraduate students in two complementary types of iteration: (1) iteratively improving experiments through cycles of modeling physical systems, designing experiments, analyzing data, and refining models and designs; (2) self-reflecting on progress, soliciting feedback, and implementing changes to study habits and habits of mind. The course consisted of three major activities: a thermal expansion activity, which spanned the first half of the semester; a final research...
Simultaneous recordings from the cortex have revealed that neural activity is highly variable and that some variability is shared across neurons in a population. Further experimental work has demonstrated that the shared component of a neuronal population's variability is typically comparable to or larger than its private component. Meanwhile, an abundance of theoretical work has assessed the impact of shared variability on the population code. For example, shared input noise is understood to be detrimental to coding fidelity. However, other contributions to variability, such as common noise,...
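One common way to operationalize the shared/private split discussed in this abstract is factor analysis of trial-to-trial responses; here is a minimal sketch on synthetic data, not the paper's method or data:

```python
# Separate shared from private variability with factor analysis: shared
# variance comes from the low-rank loadings, private variance from the
# per-neuron noise term.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_trials, n_neurons = 500, 30
latent = rng.standard_normal((n_trials, 2))        # shared (common) sources
loading = rng.standard_normal((2, n_neurons))
responses = latent @ loading + rng.standard_normal((n_trials, n_neurons))  # + private noise

fa = FactorAnalysis(n_components=2).fit(responses)
shared_var = np.sum(fa.components_ ** 2, axis=0)   # shared variance per neuron
private_var = fa.noise_variance_                   # private variance per neuron
print((shared_var / (shared_var + private_var)).mean())  # mean shared fraction
```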
As part of the CESR-TA program at Cornell, diagnostic devices to measure and quantify the electron cloud effect have been installed throughout the CESR ring. One such device is the Retarding Field Analyzer (RFA), which provides information on the local electron cloud density and energy distribution. In a magnetic field free environment, RFA measurements can be directly compared with simulation to study the growth dynamics of the cloud at a quantitative level. In particular, the photoemission and secondary emission characteristics of the instrumented chambers...
Decoding behavior, perception, or cognitive state directly from neural signals has applications in brain-computer interface research as well as implications for systems neuroscience. In the last decade, deep learning has become the state-of-the-art method in many machine learning tasks, ranging from speech recognition to image segmentation. The success of deep networks in other domains has led to a new wave of applications in neuroscience. In this article, we review deep learning approaches to neural decoding. We describe the architectures used for extracting useful features from neural recording modalities...
Linear dimensionality reduction methods are commonly used to extract low-dimensional structure from high-dimensional data. However, popular methods disregard temporal structure, rendering them prone to extracting noise rather than meaningful dynamics when applied to time series data. At the same time, many successful unsupervised learning methods for temporal, sequential and spatial data learn features which are predictive of their surrounding context. Combining these approaches, we introduce Dynamical Components Analysis (DCA),...
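A minimal sketch of the kind of objective behind DCA: score a linear projection by the Gaussian mutual information between past and future windows of the projected series. A real implementation optimizes this over projections; here it is only evaluated for two fixed projections on synthetic data, and all names are illustrative:

```python
# Predictive-information score for a projected time series, under a Gaussian
# assumption: I(past; future) from the joint covariance of stacked windows.
import numpy as np

def predictive_information(x, window):
    """x: (T,) projected series; returns Gaussian I(past; future) in nats."""
    segments = np.array([x[t - window:t + window]
                         for t in range(window, len(x) - window)])
    cov = np.cov(segments, rowvar=False)
    past, future = cov[:window, :window], cov[window:, window:]
    logdet_joint = np.linalg.slogdet(cov)[1]
    return 0.5 * (np.linalg.slogdet(past)[1]
                  + np.linalg.slogdet(future)[1] - logdet_joint)

rng = np.random.default_rng(0)
T = 5000
dynamic = np.cumsum(rng.standard_normal(T))      # slow, predictable component
noise = rng.standard_normal(T)                   # white, unpredictable component
X = np.stack([dynamic, noise], axis=1)

# The projection onto the dynamic component scores much higher than the
# projection onto white noise.
for v in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
    print(v, predictive_information(X @ v, window=5))
```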
Finding overcomplete latent representations of data has applications in data analysis, signal processing, machine learning, theoretical neuroscience and many other fields. In an overcomplete representation, the number of latent features exceeds the data dimensionality, which is useful when the data is undersampled by the measurements (compressed sensing, information bottlenecks in neural systems) or is composed from multiple complete sets of linear features, each spanning the data space. Independent Components Analysis (ICA) is a technique for learning sparse...
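For context, a minimal sketch of standard (complete) ICA with scikit-learn's FastICA on synthetic mixed sources; the paper concerns overcomplete generalizations, which this example does not implement:

```python
# Complete ICA: recover independent sources from a square linear mixture.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.stack([np.sin(2 * t),                      # smooth source
                    np.sign(np.sin(3 * t)),             # square-wave source
                    rng.laplace(size=t.size)], axis=1)  # sparse, heavy-tailed source
mixing = rng.standard_normal((3, 3))
observed = sources @ mixing.T                           # linearly mixed observations

ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(observed)                 # estimated independent components
print(recovered.shape, ica.mixing_.shape)
```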
Following the advent of a post-Moore's law field of computation, novel architectures continue to emerge. With composite, multi-million connection neuromorphic chips like IBM's TrueNorth, neural engineering has now become a feasible technology in this novel computing paradigm. High Energy Physics experiments are continuously exploring new methods of computation and data handling, including neuromorphic computing, to support growing challenges and to be prepared for future commodity trends. This work details the first instance...