Jakob Jordan

ORCID: 0000-0003-3438-5001
Research Areas
  • Neural dynamics and brain function
  • Advanced Memory and Neural Computing
  • Neural Networks and Applications
  • Neuroscience and Neural Engineering
  • Sleep and Wakefulness Research
  • Neural Networks and Reservoir Computing
  • EEG and Brain-Computer Interfaces
  • Stochastic Dynamics and Bifurcation
  • Memory and Neural Mechanisms
  • Neuroscience and Music Perception
  • Neuroscience and Neuropharmacology Research
  • Evolutionary Algorithms and Applications
  • Cell Image Analysis Techniques
  • Aesthetic Perception and Analysis
  • Functional Brain Connectivity Studies
  • Reinforcement Learning in Robotics
  • Mind wandering and attention
  • Advanced Electron Microscopy Techniques and Applications
  • Advanced X-ray Imaging Techniques
  • Generative Adversarial Networks and Image Synthesis
  • Slime Mold and Myxomycetes Research
  • Digital Holography and Microscopy
  • Nanoparticle Nucleation and Surface Interactions
  • Banking Systems and Strategies
  • Neurobiology and Insect Physiology Research

University of Bern
2017-2024

Yale University
2023-2024

University of Cambridge
2021

Jülich Aachen Research Alliance
2015-2020

Forschungszentrum Jülich
2016-2020

State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10% of the human cortex at the resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection...

10.3389/fninf.2018.00002 article EN cc-by Frontiers in Neuroinformatics 2018-02-16
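The sparsity argument above can be made concrete with a toy sketch (not the actual NEST data structures): each compute node stores synapses only for the neurons it hosts, keyed by global source id, so memory scales with local synapses rather than with the total network size.

```python
import random

def build_local_connectivity(n_total, local_targets, indegree, seed=0):
    """Toy sketch: a compute node stores synapses only for its *local*
    target neurons, keyed by (global) source id. Sources without local
    targets consume no memory -- the sparsity that brain-scale
    simulation code must exploit."""
    rng = random.Random(seed)
    local = {}  # source id -> list of local target ids
    for tgt in local_targets:
        for _ in range(indegree):
            src = rng.randrange(n_total)
            local.setdefault(src, []).append(tgt)
    return local

# 1 million neurons in total, but this node hosts only 100 of them
conn = build_local_connectivity(1_000_000, range(100), indegree=50)
# memory scales with the 100 * 50 local synapses, not with n_total
print(len(conn) <= 100 * 50)
```

At brain scale the dictionary keys cover only a vanishing fraction of all possible sources, which is exactly why a dense per-source table would be wasteful.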

Humans and other animals learn to extract general concepts from sensory experience without extensive teaching. This ability is thought to be facilitated by offline states like sleep, where previous experiences are systemically replayed. However, the characteristic creative nature of dreams suggests that learning semantic representations may go beyond merely replaying previous experiences. We support this hypothesis by implementing a cortical architecture inspired by generative adversarial networks (GANs)....

10.7554/elife.76384 article EN cc-by eLife 2022-04-06

Dendrites shape information flow in neurons. Yet, there is little consensus on the level of spatial complexity at which they operate. Through carefully chosen parameter fits, solvable in the least-squares sense, we obtain accurate reduced compartmental models at any level of complexity. We show that (back-propagating) action potentials, Ca2+ spikes, and N-methyl-D-aspartate spikes can all be reproduced with few compartments. We also investigate whether afferent connectivity motifs admit simplification by ablating...

10.7554/elife.60936 article EN cc-by eLife 2021-01-26
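The core idea of fitting reduced-model parameters in the least-squares sense can be illustrated with a deliberately minimal sketch. Here the "detailed model" is just noisy data obeying Ohm's law, and the reduced model has a single free parameter (an input resistance); the actual paper fits full impedance matrices, which this does not attempt.

```python
import numpy as np

# Toy sketch: choose reduced-model parameters by a least-squares fit
# to responses of a detailed model. The "detailed model" here is noisy
# data obeying V = R * I; the reduced model's one free parameter is R.
rng = np.random.default_rng(0)
I = np.linspace(0.1, 1.0, 20)             # injected currents (nA)
R_true = 50.0                             # MOhm, ground truth
V = R_true * I + rng.normal(0, 0.5, 20)   # "measured" responses (mV)

# least-squares solution of V ~= R * I
A = I[:, None]
R_fit, *_ = np.linalg.lstsq(A, V, rcond=None)
print(abs(R_fit[0] - R_true) < 2.0)
```

The same recipe generalizes: stack the reduced model's linear parameterization into the design matrix and solve once, which is what makes the fits "solvable in the least-squares sense".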

Continuous adaptation allows survival in an ever-changing world. Adjustments in the synaptic coupling strength between neurons are essential for this capability, setting us apart from simpler, hard-wired organisms. How these changes can be mathematically described at the phenomenological level, as so-called 'plasticity rules', is essential both for understanding biological information processing and for developing cognitively performant artificial systems. We suggest an automated approach for discovering biophysically...

10.7554/elife.66273 article EN cc-by eLife 2021-10-28
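The flavor of such an automated search can be conveyed with a much-simplified stand-in (the paper uses Cartesian genetic programming over symbolic expressions; the rule set and fitness function below are purely illustrative). Candidate plasticity rules map pre- and postsynaptic activity and the current weight to a weight change, and evolution keeps whichever rule best reproduces a target.

```python
import random

# Toy sketch of evolutionary search over plasticity rules. Candidate
# rules map (pre, post, w) to a weight change; fitness measures how
# well a rule reproduces a target Hebbian rule dw = pre * post.
RULES = {
    "hebb": lambda pre, post, w: pre * post,
    "anti": lambda pre, post, w: -pre * post,
    "decay": lambda pre, post, w: -w,
    "pre_only": lambda pre, post, w: pre,
}

def fitness(rule, samples):
    return -sum((rule(p, q, w) - p * q) ** 2 for p, q, w in samples)

rng = random.Random(1)
samples = [(rng.random(), rng.random(), rng.random()) for _ in range(50)]
population = list(RULES)
for _ in range(10):  # evolutionary loop: keep the best, resample the rest
    population.sort(key=lambda name: fitness(RULES[name], samples), reverse=True)
    population = [population[0]] + [rng.choice(list(RULES)) for _ in range(3)]
print(population[0])  # prints "hebb"
```

In the real approach the search space is a grammar of mathematical expressions rather than a fixed menu, but the elitist keep-the-best loop is the same.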

Abstract One of the most fundamental laws of physics is the principle of least action. Motivated by its predictive power, we introduce a neuronal least-action principle for cortical processing of sensory streams to produce appropriate behavioural outputs in real time. The principle postulates that the voltage dynamics of cortical pyramidal neurons prospectively minimizes the local somato-dendritic mismatch error within individual neurons. For output neurons, it implies minimizing an instantaneous behavioural error. For deep network neurons, it implies the prospective firing...

10.1101/2023.03.25.534198 preprint EN cc-by-nc-nd bioRxiv (Cold Spring Harbor Laboratory) 2023-03-25

While sensory representations in the brain depend on context, it remains unclear how such modulations are implemented at the biophysical level, and how processing layers further up the hierarchy can extract useful features for each possible contextual state. Here, we demonstrate that dendritic N-Methyl-D-Aspartate spikes can, within physiological constraints, implement contextual modulation of feedforward processing. Such neuron-specific modulations exploit prior knowledge, encoded in stable weights, to achieve transfer learning...

10.1073/pnas.2300558120 article EN cc-by Proceedings of the National Academy of Sciences 2023-07-31

Semantic representations in higher sensory cortices form the basis for robust, yet flexible behavior. These representations are acquired over the course of development in an unsupervised fashion and continuously maintained over an organism's lifespan. Predictive processing theories propose that these representations emerge from predicting or reconstructing sensory inputs. However, brains are known to generate virtual experiences, such as during imagination and dreaming, that go beyond previously experienced inputs. Here, we suggest that such virtual experiences may be just as relevant...

10.1016/j.neubiorev.2023.105508 article EN cc-by Neuroscience & Biobehavioral Reviews 2023-12-12

High-level brain function such as memory, classification or reasoning can be realized by means of recurrent networks of simplified model neurons. Analog neuromorphic hardware constitutes a fast and energy-efficient substrate for the implementation of such neural computing architectures in technical applications and neuroscientific research. The functional performance of neural networks is often critically dependent on the level of correlations in the neural activity. In finite networks, correlations are typically inevitable due to shared presynaptic input....

10.1103/physrevx.6.021023 article EN cc-by Physical Review X 2016-05-18

Abstract Neuronal network models of high-level brain functions such as memory recall and reasoning often rely on the presence of some form of noise. The majority of these models assumes that each neuron in the functional network is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In vivo, synaptic background input has been suggested to serve as the main source of noise in biological neuronal networks. However, the finiteness of the number of such noise sources constitutes a challenge to this idea. Here, we show that shared-noise correlations...

10.1038/s41598-019-54137-7 article EN cc-by Scientific Reports 2019-12-04
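Why a finite pool of noise sources necessarily produces shared-noise correlations can be shown in a few lines. In this toy sketch (the source counts and overlap are arbitrary, not taken from the paper), two neurons each sum 60 of 100 available sources, 30 of which they share, so their summed inputs are correlated with coefficient 30/60 = 0.5.

```python
import numpy as np

# Toy sketch: with a finite pool of noise sources, two neurons that
# share a fraction of their inputs receive correlated background noise.
rng = np.random.default_rng(0)
n_sources, n_steps = 100, 20000
pool = rng.normal(size=(n_sources, n_steps))

# each neuron sums 60 sources; 30 of them are shared
input_a = pool[:60].sum(axis=0)        # sources 0..59
input_b = pool[30:90].sum(axis=0)      # sources 30..89 (overlap: 30..59)
c = np.corrcoef(input_a, input_b)[0, 1]
print(abs(c - 0.5) < 0.05)  # expected correlation = 30 / 60 = 0.5
```

For independent unit-variance sources the covariance equals the number of shared sources and each variance equals the number of summed sources, which gives the 0.5 directly.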

Modern computational neuroscience strives to develop complex network models to explain the dynamics and function of brains in health and disease. This process goes hand in hand with advancements in the theory of neuronal networks and the increasing availability of detailed anatomical data on brain connectivity. Large-scale models that study the interactions between multiple brain areas with intricate connectivity and investigate phenomena on long time scales such as system-level learning require progress in simulation speed. The corresponding development...

10.3389/fninf.2022.837549 article EN cc-by Frontiers in Neuroinformatics 2022-05-11

Simulation is a third pillar next to experiment and theory in the study of complex dynamic systems such as biological neural networks. Contemporary brain-scale networks correspond to directed random graphs of a few million nodes, each with an in-degree and out-degree of several thousands of edges, where nodes and edges correspond to the fundamental units, neurons and synapses, respectively. The activity in neuronal networks is also sparse. Each neuron occasionally transmits a brief signal, called a spike, via its outgoing synapses to the corresponding target...

10.1016/j.parco.2022.102952 article EN cc-by Parallel Computing 2022-07-13

Generic simulation code for spiking neuronal networks spends the major part of the time in the phase where spikes have arrived at a compute node and need to be delivered to their target neurons. These spikes were emitted over the last interval between communication steps by source neurons distributed across many compute nodes and are inherently irregular and unsorted with respect to their targets. For finding those targets, spikes are dispatched into a three-dimensional data structure with decisions on target thread and synapse type made on the way. With growing network size,...

10.3389/fninf.2021.785068 article EN cc-by Frontiers in Neuroinformatics 2022-03-01
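The dispatch problem can be sketched in miniature: spikes arrive unsorted with respect to their targets, and grouping them by delivery coordinates first turns scattered per-spike dispatch into batch delivery. The field names below are illustrative, not the actual NEST data structures.

```python
from collections import defaultdict

# Toy sketch: spikes arriving from the communication step are unsorted
# with respect to their targets; grouping them by (thread, synapse type)
# turns scattered dispatch into cache-friendly batch delivery.
incoming = [
    {"thread": 1, "syn_type": "stdp", "target": 7},
    {"thread": 0, "syn_type": "static", "target": 2},
    {"thread": 1, "syn_type": "static", "target": 4},
    {"thread": 0, "syn_type": "static", "target": 9},
]

batches = defaultdict(list)
for spike in incoming:
    batches[(spike["thread"], spike["syn_type"])].append(spike["target"])

for key in sorted(batches):
    print(key, sorted(batches[key]))
```

Each batch can then be handed to the owning thread as one contiguous unit instead of deciding thread and synapse type anew for every spike.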

The importance of sleep for healthy brain function is widely acknowledged. However, it remains mysterious how the sleeping brain, disconnected from the outside world and plunged into the fantastic experiences of dreams, is actively learning. In this perspective article, we review a computational approach inspired by modern artificial intelligence that suggests a role for dreams occurring during rapid-eye-movement (REM) sleep. REM dreams are characterized by an adversarial process between feedforward and feedback pathways...

10.20944/preprints202403.0684.v1 preprint EN 2024-03-12

Effective learning in neuronal networks requires the adaptation of individual synapses given their relative contribution to solving a task. However, physical learning systems, whether biological or artificial, are constrained by spatio-temporal locality. How such networks can perform efficient credit assignment remains, to a large extent, an open question. In Machine Learning, the answer is almost universally the error backpropagation algorithm, through both space (BP) and time (BPTT). However, BP(TT) is well-known to rely on...

10.48550/arxiv.2403.16933 preprint EN arXiv (Cornell University) 2024-03-25

The importance of sleep for healthy brain function is widely acknowledged. However, it remains unclear how the internal generation of dreams might facilitate cognitive processes. In this perspective we review a computational approach inspired by artificial intelligence that proposes a framework for how dreams occurring during rapid-eye-movement (REM) sleep can contribute to learning and creativity. In this framework, REM dreams are characterized by an adversarial process that, against the dream reality, tells a discriminator network to classify...

10.20944/preprints202403.0684.v2 preprint EN 2024-05-10

The importance of sleep for healthy brain function is widely acknowledged. However, it remains unclear how the internal generation of dreams might facilitate cognitive processes. In this perspective, we review a computational approach inspired by artificial intelligence that proposes a framework for how dreams occurring during rapid-eye-movement (REM) sleep can contribute to learning and creativity. In this framework, REM dreams are characterized by an adversarial process that, against the dream reality, tells a discriminator network...

10.3390/ctn8020021 article EN cc-by Clinical and Translational Neuroscience 2024-05-31

A fundamental function of cortical circuits is the integration of information from different sources to form a reliable basis for behavior. While animals behave as if they optimally integrate information according to Bayesian probability theory, the implementation of the required computations in the biological substrate remains unclear. We propose a novel view on the dynamics of conductance-based neurons and synapses which suggests that they are naturally equipped to perform Bayesian integration. In our approach, apical dendrites represent prior...

10.1371/journal.pcbi.1012047 article EN cc-by PLoS Computational Biology 2024-06-12
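The analogy this line of work builds on is compact enough to state numerically: the steady-state membrane potential of a conductance-based neuron, V = (g1*E1 + g2*E2) / (g1 + g2), has the same form as the Bayesian (precision-weighted) combination of two Gaussian cues with means E1, E2 and precisions g1, g2. The values below are illustrative, not taken from the paper.

```python
# Toy sketch: precision-weighted combination of Gaussian cues, which
# mirrors the steady state of a two-conductance membrane.
def combine(means, precisions):
    """Posterior mean of Gaussian cues = conductance-weighted potential."""
    num = sum(m * p for m, p in zip(means, precisions))
    return num / sum(precisions)

# prior (e.g. apical input) at -70 mV with low precision,
# likelihood (e.g. basal input) at -50 mV with three times the precision
v = combine([-70.0, -50.0], [1.0, 3.0])
print(v)  # prints -55.0
```

Raising a conductance thus plays the role of sharpening a cue's precision, pulling the potential toward that cue's reversal potential.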

Investigating the dynamics and function of large-scale spiking neuronal networks with realistic numbers of synapses is made possible today by state-of-the-art simulation code that scales to the largest contemporary supercomputers. However, simulations that involve electrical interactions, also called gap junctions, besides chemical synapses scale only poorly due to a communication scheme that collects global data on each compute node. In comparison to chemical synapses, gap junctions are far less abundant. To improve scalability we...

10.3389/fninf.2020.00012 article EN cc-by Frontiers in Neuroinformatics 2020-05-05
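The scaling argument can be reduced to arithmetic: a scheme that collects global data makes every node receive data from all N nodes, while point-to-point exchange only touches the few nodes hosting actual gap-junction partners. The node counts and payload sizes below are made up for illustration.

```python
# Toy sketch: communication volume per node under a global-collection
# scheme versus direct point-to-point exchange of gap-junction data.
def global_volume(n_nodes, payload):
    return n_nodes * payload          # grows linearly with machine size

def p2p_volume(n_partners, payload):
    return n_partners * payload       # independent of machine size

print(global_volume(10_000, 8))       # prints 80000
print(p2p_volume(12, 8))              # prints 96
```

Because gap junctions are far less abundant than chemical synapses, the partner count stays small even as the machine grows, which is what makes the point-to-point variant scalable.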

Neural network simulation is an important tool for generating and evaluating hypotheses on the structure, dynamics, and function of neural circuits. For scientific questions addressing organisms operating autonomously in their environments, in particular where learning is involved, it is crucial to be able to operate such simulations in a closed-loop fashion. In such a set-up, the agent continuously receives sensory stimuli from the environment and provides motor signals that manipulate the environment or move the agent within it. So far, most studies...

10.3389/fncom.2019.00046 article EN cc-by Frontiers in Computational Neuroscience 2019-08-02

The response time of physical computational elements is finite, and neurons are no exception. In hierarchical models of cortical networks each layer thus introduces a response lag. This inherent property of physical dynamical systems results in delayed processing of stimuli and causes a timing mismatch between network output and instructive signals, thus afflicting not only inference, but also learning. We introduce Latent Equilibrium, a new framework for inference and learning in networks of slow components which avoids these issues by harnessing the...

10.48550/arxiv.2110.14549 preprint EN cc-by-nc-sa arXiv (Cornell University) 2021-01-01
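The prospective-coding idea behind this framework can be sketched in a few lines: a slow (low-pass) unit u lags its input x, but the readout r = u + tau * du/dt cancels the lag, since for the continuous dynamics tau * du/dt = x - u this sum equals x exactly. Variable names here are ours, not the paper's notation.

```python
import numpy as np

# Toy sketch: a prospective readout compensates the lag of a slow unit.
tau, dt = 10.0, 0.1
t = np.arange(0, 100, dt)
x = np.sin(0.1 * t)              # time-varying input

u = np.zeros_like(t)             # slow, membrane-like state
for i in range(1, len(t)):
    du = (x[i - 1] - u[i - 1]) / tau
    u[i] = u[i - 1] + dt * du    # Euler step of tau * du/dt = x - u

du_dt = np.gradient(u, dt)
r = u + tau * du_dt              # "prospective" readout

# after the initial transient, r tracks x far more closely than u does
lag_err_u = np.abs(u[500:] - x[500:]).mean()
lag_err_r = np.abs(r[500:] - x[500:]).mean()
print(lag_err_r < lag_err_u)
```

The residual error of r comes only from discretization; in the continuous limit the lagged state plus its scaled derivative reconstructs the input instantaneously, which is the property Latent Equilibrium exploits for both inference and learning.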

Continuous adaptation allows survival in an ever-changing world. Adjustments in the synaptic coupling strength between neurons are essential for this capability, setting us apart from simpler, hard-wired organisms. How these changes can be mathematically described at the phenomenological level, as so-called "plasticity rules", is essential both for understanding biological information processing and for developing cognitively performant artificial systems. We suggest an automated approach for discovering biophysically...

10.48550/arxiv.2005.14149 preprint EN other-oa arXiv (Cornell University) 2020-01-01