Jesse A. Livezey

ORCID: 0000-0003-0494-8758
Research Areas
  • Neural dynamics and brain function
  • Neural Networks and Applications
  • Particle Accelerators and Free-Electron Lasers
  • Particle accelerators and beam dynamics
  • Advanced Memory and Neural Computing
  • stochastic dynamics and bifurcation
  • Blind Source Separation Techniques
  • Natural Language Processing Techniques
  • Image Retrieval and Classification Techniques
  • Particle Detector Development and Performance
  • Superconducting Materials and Applications
  • Anomaly Detection Techniques and Applications
  • EEG and Brain-Computer Interfaces
  • Science Education and Pedagogy
  • Handwritten Text Recognition Techniques
  • Gyrotron and Vacuum Electronics Research
  • Time Series Analysis and Forecasting
  • Innovative Teaching and Learning Methods
  • Experimental Learning in Engineering
  • Functional Brain Connectivity Studies
  • Visual perception and processing mechanisms
  • Digital Media Forensic Detection
  • Electron and X-Ray Spectroscopy Techniques
  • Face recognition and analysis
  • Generative Adversarial Networks and Image Synthesis

Clear Science Corporation (United States)
2025

Lawrence Berkeley National Laboratory
2017-2024

University of California, Berkeley
2016-2022

Center for Theoretical Biological Physics
2021-2022

Compass (United States)
2018

Cornell University
2014

The Theano Development Team: Rami Al‐Rfou, Guillaume Alain, Amjad Almahairi, Christof Angermueller, Dzmitry Bahdanau, Nicolas Ballas, Frédéric Bastien, Justin Bayer, Anatoly Belikov, Alexander Belopolsky, Yoshua Bengio, Arnaud Bergeron, James Bergstra, Valentin Bisson, Josh Bleecher Snyder, Nicolas Bouchard, Nicolas Boulanger-Lewandowski, Xavier Bouthillier, Alexandre de Brébisson, Olivier Breuleux, Pierre-Luc Carrier, Kyunghyun Cho, Jan Chorowski, Paul Christiano, Tim Cooijmans, Marc-Alexandre Côté, Myriam Côté, Aaron Courville, Yann Dauphin, Olivier Delalleau, Julien Demouth, Guillaume Desjardins, Sander Dieleman, Laurent Dinh, Mélanie Ducoffe, Vincent Dumoulin, Samira Ebrahimi Kahou, Dumitru Erhan, Ziye Fan, Orhan Fırat, Mathieu Germain, Xavier Glorot, Ian Goodfellow, M. Graham, Çağlar Gülçehre, Philippe Hamel, Iban Harlouchet, Jean-Philippe Heng, Balázs Hidasi, Sina Honari, Arjun Jain, Sébastien Jean, Kai Jia, Mikhail Korobov, Vivek Kulkarni, Alex Lamb, Pascal Lamblin, Eric Larsen, César Laurent, Sean Lee, Simon Lefrançois, Simon Lemieux, Nicholas Léonard, Zhouhan Lin, Jesse A. Livezey, Cory Lorenz, Jeremiah Lowin, Qianli Ma, Pierre-Antoine Manzagol, Olivier Mastropietro, Robert T. McGibbon, Roland Memisevic, Bart van Merriënboer, Vincent Michalski, Mehdi Mirza, Alberto Orlandi, Christopher Pal, Razvan Pascanu, Mohammad Pezeshki, Colin Raffel, Daniel Renshaw, Matthew Rocklin, Adriana Romero, M. Roth, Peter Sadowski, John Salvatier, François Savard, Jan Schlüter, John Schulman, Gabriel Schwartz, Iulian Vlad Serban, Dmitriy Serdyuk, Samira Shabanian, Étienne Simon, Sigurd Spieckermann, Siva Subramanyam, Jakub Sygnowski, Jérémie Tanguay, Gijs van Tulder

Theano is a Python library that allows one to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. Since its introduction, it has been one of the most used CPU and GPU mathematical compilers, especially in the machine learning community, and has shown steady performance improvements. It has been actively and continuously developed since 2008, multiple frameworks have been built on top of it, and it has been used to produce many state-of-the-art models. The present article is structured as follows. Section I provides an...
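A minimal sketch of the define/optimize/evaluate workflow the abstract describes, assuming a working Theano installation; the variable names are illustrative only.

    import numpy as np
    import theano
    import theano.tensor as T

    # Define a symbolic expression over multi-dimensional arrays.
    X = T.matrix("X")
    W = T.matrix("W")
    b = T.vector("b")
    y = T.nnet.sigmoid(T.dot(X, W) + b)  # affine map followed by a nonlinearity

    # Compile the expression graph into an optimized callable (CPU or GPU).
    f = theano.function(inputs=[X, W, b], outputs=y)

    # Evaluate on concrete NumPy arrays.
    out = f(np.random.randn(4, 3), np.random.randn(3, 2), np.zeros(2))
    print(out.shape)  # (4, 2)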

10.48550/arxiv.1605.02688 preprint EN other-oa arXiv (Cornell University) 2016-01-01

Deep learning has enjoyed a great deal of success because of its ability to learn useful features for tasks such as classification. But there has been less exploration in learning the factors of variation apart from the classification signal. By augmenting autoencoders with simple regularization terms during training, we demonstrate that standard deep architectures can discover and explicitly represent factors of variation beyond those relevant for categorization. We introduce a cross-covariance penalty (XCov) as a method to disentangle factors like...
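A hedged NumPy sketch of the kind of cross-covariance term the abstract refers to (not the authors' implementation): given two groups of latent activations in a batch, penalize the sum of squared entries of their cross-covariance matrix, pushing the groups toward decorrelation.

    import numpy as np

    def xcov_penalty(y, z):
        """Sum of squared cross-covariances between two groups of activations.

        y : (batch, d1) array, e.g. class-prediction units
        z : (batch, d2) array, e.g. unsupervised latent units
        Sketch of a cross-covariance (XCov) style regularizer; scaling and the
        layers it applies to follow the paper, not this example.
        """
        n = y.shape[0]
        yc = y - y.mean(axis=0, keepdims=True)   # center each unit over the batch
        zc = z - z.mean(axis=0, keepdims=True)
        c = yc.T @ zc / n                        # (d1, d2) cross-covariance matrix
        return 0.5 * np.sum(c ** 2)

    # Example: the penalty is near zero for independent activations.
    rng = np.random.default_rng(0)
    print(xcov_penalty(rng.normal(size=(256, 10)), rng.normal(size=(256, 32))))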

10.48550/arxiv.1412.6583 preprint EN other-oa arXiv (Cornell University) 2014-01-01

A fundamental challenge in neuroscience is to understand what structure in the world is represented in spatially distributed patterns of neural activity from multiple single-trial measurements. This is often accomplished by learning a simple, linear transformation between neural features and features of the sensory stimuli or motor task. While successful in some early sensory processing areas, such linear mappings are unlikely to be ideal tools for elucidating nonlinear, hierarchical representations in higher-order brain areas during complex tasks, such...

10.1371/journal.pcbi.1007091 article EN cc-by PLoS Computational Biology 2019-09-16

Decoding behavior, perception or cognitive state directly from neural signals is critical for brain-computer interface research and an important tool for systems neuroscience. In the last decade, deep learning has become the state-of-the-art method in many machine learning tasks ranging from speech recognition to image segmentation. The success of deep networks in other domains has led to a new wave of applications in neuroscience. In this article, we review deep learning approaches to neural decoding. We describe the architectures used for extracting useful features from neural recording...
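As a generic illustration of the decoding setup such a review covers (not a model from the article itself), the sketch below fits a small fully connected network mapping neural features to discrete labels, assuming scikit-learn is available; the data here are synthetic stand-ins.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_trials, n_features, n_classes = 500, 64, 4

    # Synthetic stand-in for per-trial neural features and behavioral labels.
    labels = rng.integers(0, n_classes, size=n_trials)
    features = rng.normal(size=(n_trials, n_features)) + labels[:, None] * 0.3

    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0)

    # A small fully connected decoder; the review surveys far richer
    # architectures (CNNs, RNNs) than this sketch.
    decoder = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    decoder.fit(X_train, y_train)
    print("decoding accuracy:", decoder.score(X_test, y_test))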

10.1093/bib/bbaa355 article EN Briefings in Bioinformatics 2020-11-19

Neural markers of visual function in age-related macular degeneration (AMD) allow clinicians and researchers to directly evaluate the functional changes in visual processing which occur as a result of progressive loss of afferent input from the macula. Unfortunately, few protocols exist that elicit such neural markers, and most of these are poorly adapted to AMD. Here, we propose a novel method of embedding frequency tags into full color motion videos by periodically manipulating contrast information in different spatial...
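A hedged sketch of the general frequency-tagging idea described above: sinusoidally modulating the contrast of video frames at a fixed tag frequency so that evoked responses can be recovered at that frequency. The parameters and values are illustrative, not the protocol from the paper.

    import numpy as np

    def tag_contrast(frames, fps, tag_hz, depth=0.5):
        """Periodically modulate frame contrast at a tagging frequency.

        frames : (n_frames, H, W, 3) float array in [0, 1]
        fps    : video frame rate
        tag_hz : tag frequency in Hz (illustrative value below)
        depth  : modulation depth of the contrast envelope
        Generic sketch of frequency tagging, not the published protocol.
        """
        t = np.arange(frames.shape[0]) / fps
        envelope = 1.0 + depth * np.sin(2 * np.pi * tag_hz * t)   # per-frame gain
        mean = frames.mean(axis=(1, 2, 3), keepdims=True)         # per-frame mean luminance
        return np.clip(mean + (frames - mean) * envelope[:, None, None, None], 0, 1)

    # Example: tag a 10 s, 30 fps clip at 6 Hz.
    video = np.random.rand(300, 64, 64, 3)
    tagged = tag_contrast(video, fps=30, tag_hz=6.0)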

10.3389/fnhum.2025.1569282 article EN cc-by Frontiers in Human Neuroscience 2025-05-01

We have designed an introductory laboratory course that engaged first-year undergraduate students in two complementary types of iteration: (1) iterative improvement of experiments through cycles of modeling systems, designing experiments, analyzing data, and refining models and designs; and (2) self-reflection on progress, soliciting feedback, and implementing changes to study habits and habits of mind. The course consisted of three major activities: a thermal expansion activity, which spanned the first half of the semester; a final research...

10.1119/1.4955147 article EN American Journal of Physics 2016-08-20

10.1016/j.nima.2014.05.051 article EN Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment 2014-05-21

10.1016/j.nima.2014.09.069 article EN publisher-specific-oa Nuclear Instruments and Methods in Physics Research Section A Accelerators Spectrometers Detectors and Associated Equipment 2014-10-08

Simultaneous recordings from the cortex have revealed that neural activity is highly variable and that some variability is shared across neurons in a population. Further experimental work has demonstrated that the shared component of a neuronal population's variability is typically comparable to or larger than its private component. Meanwhile, an abundance of theoretical work has assessed the impact of shared variability on a population code. For example, shared input noise is understood to be detrimental to coding fidelity. However, other contributions to variability, such as common noise,...
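A small simulation along the lines of the setup described above (my sketch, not the paper's model): each neuron's response combines a stimulus-driven mean, private noise, and a common noise term added to every neuron, and the shared component dominates the off-diagonal covariance.

    import numpy as np

    rng = np.random.default_rng(0)
    n_neurons, n_trials = 50, 2000
    tuning = rng.normal(size=n_neurons)        # stimulus-driven mean responses
    sigma_private, sigma_shared = 1.0, 1.5     # illustrative noise scales

    # Shared (common) noise: one draw per trial, added to every neuron.
    shared = sigma_shared * rng.normal(size=(n_trials, 1))
    private = sigma_private * rng.normal(size=(n_trials, n_neurons))
    responses = tuning[None, :] + shared + private

    cov = np.cov(responses, rowvar=False)
    shared_var = np.mean(cov[~np.eye(n_neurons, dtype=bool)])  # mean off-diagonal covariance
    total_var = np.mean(np.diag(cov))
    print(f"shared / total variance = {shared_var / total_var:.2f}")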

10.1162/neco_a_01287 article EN Neural Computation 2020-05-20

As part of the CESR-TA program at Cornell, diagnostic devices to measure and quantify the electron cloud effect have been installed throughout the CESR ring. One such device is the Retarding Field Analyzer (RFA), which provides information on the local density and energy distribution of the electron cloud. In a magnetic field free environment, RFA measurements can be directly compared with simulation to study cloud growth and dynamics at a quantitative level. In particular, the photoemission and secondary emission characteristics of the instrumented chambers...

10.1103/physrevstab.17.061001 article EN cc-by Physical Review Special Topics - Accelerators and Beams 2014-06-05

Decoding behavior, perception, or cognitive state directly from neural signals has applications in brain-computer interface research as well as implications for systems neuroscience. In the last decade, deep learning has become the state-of-the-art method in many machine learning tasks ranging from speech recognition to image segmentation. The success of deep networks in other domains has led to a new wave of applications in neuroscience. In this article, we review deep learning approaches to neural decoding. We describe the architectures used for extracting useful features from neural recording modalities...

10.48550/arxiv.2005.09687 preprint EN other-oa arXiv (Cornell University) 2020-01-01

Linear dimensionality reduction methods are commonly used to extract low-dimensional structure from high-dimensional data. However, popular methods disregard temporal structure, rendering them prone to extracting noise rather than meaningful dynamics when applied to time series data. At the same time, many successful unsupervised learning methods for temporal, sequential and spatial data extract features which are predictive of their surrounding context. Combining these approaches, we introduce Dynamical Components Analysis (DCA),...
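The objective behind a method like DCA can be sketched as follows: for a linear projection of the time series, estimate the Gaussian mutual information between windows of past and future projected samples from the data's spacetime covariance, and seek projections that maximize it. The code below is a simplified sketch of that objective for a fixed projection, not the released DCA implementation.

    import numpy as np

    def gaussian_predictive_info(X, V, T):
        """Gaussian mutual information between past and future windows of a projection.

        X : (n_samples, n_features) time series
        V : (n_features, d) projection matrix
        T : window length (number of past and future samples)
        Simplified sketch of a predictive-information objective for fixed V.
        """
        Y = X @ V                                   # projected time series
        n, d = Y.shape
        # Stack 2T consecutive projected samples into one vector per time point.
        windows = np.stack([Y[i:n - 2 * T + i + 1] for i in range(2 * T)], axis=1)
        Z = windows.reshape(windows.shape[0], 2 * T * d)
        cov = np.cov(Z, rowvar=False)
        past = cov[: T * d, : T * d]
        future = cov[T * d :, T * d :]
        return 0.5 * (np.linalg.slogdet(past)[1]
                      + np.linalg.slogdet(future)[1]
                      - np.linalg.slogdet(cov)[1])

    # Example: a random 3D projection of a smooth, 10-dimensional AR(1) series.
    rng = np.random.default_rng(0)
    noise = rng.normal(size=(5000, 10))
    X = np.zeros((5000, 10))
    for t in range(1, 5000):
        X[t] = 0.9 * X[t - 1] + noise[t]
    V, _ = np.linalg.qr(rng.normal(size=(10, 3)))
    print(gaussian_predictive_info(X, V, T=5))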

10.48550/arxiv.1905.09944 preprint EN other-oa arXiv (Cornell University) 2019-01-01

Finding overcomplete latent representations of data has applications in data analysis, signal processing, machine learning, theoretical neuroscience and many other fields. In an overcomplete representation, the number of latent features exceeds the data dimensionality, which is useful when the data is undersampled by the measurements (compressed sensing, information bottlenecks in neural systems) or composed from multiple complete sets of linear features, each spanning the data space. Independent Components Analysis (ICA) is a technique for learning sparse...
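As a hedged illustration of the general setting (not the algorithm in the preprint, which concerns overcomplete ICA), the sketch below learns an overcomplete dictionary with scikit-learn's sparse dictionary learning, where the number of latent features exceeds the data dimensionality.

    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    rng = np.random.default_rng(0)
    n_samples, n_dims, n_latents = 500, 16, 48   # 3x overcomplete (illustrative)

    # Synthetic data: sparse combinations of random ground-truth features.
    true_features = rng.normal(size=(n_latents, n_dims))
    codes = rng.normal(size=(n_samples, n_latents))
    codes *= rng.random((n_samples, n_latents)) < 0.1   # keep ~10% of coefficients
    X = codes @ true_features + 0.01 * rng.normal(size=(n_samples, n_dims))

    # Stand-in method: sparse dictionary learning with more components than dims.
    model = DictionaryLearning(n_components=n_latents, alpha=0.5, max_iter=100,
                               random_state=0)
    learned_codes = model.fit_transform(X)
    print(learned_codes.shape, model.components_.shape)  # (500, 48) (48, 16)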

10.48550/arxiv.1606.03474 preprint EN other-oa arXiv (Cornell University) 2016-01-01

Following the advent of a post-Moore's law field of computation, novel architectures continue to emerge. With composite, multi-million connection neuromorphic chips like IBM's TrueNorth, neural engineering has now become a feasible technology in this computing paradigm. High Energy Physics experiments are continuously exploring new methods of computation and data handling, including neuromorphic computing, to support growing challenges and be prepared for future commodity computing trends. This work details the first instance...

10.1088/1742-6596/898/4/042021 article EN Journal of Physics Conference Series 2017-10-01