- Advanced Memory and Neural Computing
- Neural Dynamics and Brain Function
- Ferroelectric and Negative Capacitance Devices
- Advanced Graph Neural Networks
- Neural Networks and Applications
- Neural Networks and Reservoir Computing
- Neuroscience and Neural Engineering
- EEG and Brain-Computer Interfaces
- Modular Robots and Swarm Intelligence
- Complex Network Analysis Techniques
- Network Security and Intrusion Detection
- Topic Modeling
- Topology Optimization in Engineering
- Robotics and Automated Systems
- Advanced Materials and Mechanics
- Spacecraft Design and Technology
- Machine Learning in Materials Science
- CCD and CMOS Imaging Sensors
- Structural Analysis and Optimization
- Explainable Artificial Intelligence (XAI)
- Space Satellite Systems and Control
- Photoreceptor and Optogenetics Research
- Quantum Computing Algorithms and Architecture
- Optimization and Search Problems
- Manufacturing Process and Optimization
University of Vienna
2025
European Space Research and Technology Centre
2021-2025
Heidelberg University
2017-2024
European Space Agency
2023-2024
Kirchhoff (Germany)
2017-2024
Institute for Physics
2019-2024
Advanced Scientific Concepts (United States)
2024
University of Bern
2019-2024
Siemens (Germany)
2019-2021
Selma University
2000
We present first experimental results on the novel BrainScaleS-2 neuromorphic architecture, based on an analog neuro-synaptic core and augmented by embedded microprocessors for complex plasticity and experiment control. The high acceleration factor of 1000 compared to biological dynamics enables the execution of computationally expensive tasks, allowing fast emulation of long-duration experiments or rapid iteration over many consecutive trials. The flexibility of our architecture is demonstrated in a suite of five distinct...
We present a novel software feature for the BrainScaleS-2 accelerated neuromorphic platform that facilitates partitioned emulation of large-scale spiking neural networks. This approach is well suited for deep networks and allows the sequential emulation of a model on undersized hardware resources, provided that the largest recurrent subnetwork and the required neuron fan-in fit on the substrate. We demonstrate the training of two network models, using the MNIST and EuroSAT datasets, that exceed the physical size constraints of a single-chip system. The ability to emulate and train...
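As a hedged illustration of the partitioning idea described above, the following NumPy sketch runs a deep feed-forward network one partition at a time on hardware too small to hold the whole model; the `MAX_NEURONS_PER_CHIP` constant, the `emulate_layer` stub and the layer sizes are illustrative assumptions, not the BrainScaleS-2 API or its actual limits.

```python
import numpy as np

MAX_NEURONS_PER_CHIP = 512  # illustrative hardware limit, not the real figure

def emulate_layer(weights, input_activity):
    """Stand-in for running one feed-forward layer on the substrate.

    Here we merely threshold a rate-based projection; on real hardware this
    step would configure the chip, replay the recorded spikes from the
    previous partition and read back the resulting spike trains.
    """
    return (input_activity @ weights > 0.5).astype(float)

def partitioned_emulation(layer_weights, stimulus):
    """Emulate a deep SNN layer by layer on undersized hardware.

    Each partition must fit on the chip on its own; activity recorded from
    one partition is fed as input to the next, sequentially.
    """
    activity = stimulus
    for W in layer_weights:
        assert W.shape[1] <= MAX_NEURONS_PER_CHIP, "partition too large for one chip"
        activity = emulate_layer(W, activity)  # one partition at a time
    return activity

# toy usage: a 784-256-10 network emulated in two sequential partitions
rng = np.random.default_rng(0)
weights = [rng.normal(size=(784, 256)) * 0.05, rng.normal(size=(256, 10)) * 0.1]
out = partitioned_emulation(weights, rng.random(784))
```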
The massively parallel nature of biological information processing plays an important role due to its superiority in comparison to human-engineered computing devices. In particular, it may hold the key to overcoming the von Neumann bottleneck that limits contemporary computer architectures. Physical-model neuromorphic devices seek to replicate not only this inherent parallelism, but also aspects of its microscopic dynamics in analog circuits emulating neurons and synapses. However, these machines require network...
An increasing body of evidence suggests that the trial-to-trial variability of spiking activity in the brain is not mere noise, but rather a reflection of a sampling-based encoding scheme for probabilistic computing. Since the precise statistical properties of neural activity are important in this context, many models assume an ad-hoc source of well-behaved, explicit noise, either on the input or the output side of single-neuron dynamics, most often assuming an independent Poisson process in either case. However, these assumptions are somewhat problematic:...
One of the most fundamental laws of physics is the principle of least action. Motivated by its predictive power, we introduce a neuronal least-action principle for cortical processing of sensory streams to produce appropriate behavioural outputs in real time. The principle postulates that the voltage dynamics of pyramidal neurons prospectively minimizes the local somato-dendritic mismatch error within individual neurons. For output neurons, this implies minimizing an instantaneous behavioural error. For deep network neurons, it implies prospective firing...
Machine learning techniques are gaining attention in the context of intrusion detection due to the increasing amounts of data generated by monitoring tools, as well as the sophistication displayed by attackers in hiding their activity. However, existing methods often exhibit important limitations in terms of the quantity and relevance of the generated alerts. Recently, knowledge graphs are finding application in the cybersecurity domain, showing the potential to alleviate some of these drawbacks thanks to their ability to seamlessly integrate data from multiple domains using...
Architected materials possessing physico-chemical properties adaptable to disparate environmental conditions embody a disruptive new domain of materials science. Fueled by advances in digital design and fabrication, materials shaped into lattice topologies enable a degree of property customization not afforded to bulk materials. A promising venue for inspiration toward their design is the irregular micro-architectures found in nature. However, the immense variability unlocked by such irregularity is challenging to probe analytically. Here, we...
Despite the recent success of reconciling spike-based coding with the error backpropagation algorithm, spiking neural networks are still mostly applied to tasks stemming from sensory processing, operating on traditional data structures like visual or auditory data. A rich data representation that finds wide application in industry and research is the so-called knowledge graph - a graph-based structure where entities are depicted as nodes and relations between them as edges. Complex systems like molecules, social...
A steadily increasing body of evidence suggests that the brain performs probabilistic inference to interpret and respond to sensory input, and that trial-to-trial variability in neural activity plays an important role. The sampling hypothesis interprets stochastic activity as samples from an underlying probability distribution and has been shown to be compatible with biologically observed firing patterns. In many studies, uncorrelated noise is used as a source of stochasticity, discounting the fact that cortical neurons may share a significant...
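As a hedged illustration of the sampling hypothesis mentioned above, the following sketch uses binary stochastic units whose joint activity, under repeated logistic updates, samples from a Boltzmann distribution. The weights, biases and update schedule are arbitrary toy values, not parameters from the study.

```python
import numpy as np

def gibbs_sample(W, b, n_samples=5000, burn_in=500, rng=None):
    """Sample binary states z in {0,1}^n from p(z) proportional to exp(z·W·z/2 + b·z).

    Each unit is updated with a logistic function of its total input,
    mimicking a stochastic neuron; the collected states approximate the
    distribution the network encodes.
    """
    rng = rng or np.random.default_rng(0)
    n = len(b)
    z = rng.integers(0, 2, size=n).astype(float)
    samples = []
    for step in range(burn_in + n_samples):
        for i in range(n):
            u = W[i] @ z - W[i, i] * z[i] + b[i]  # membrane-potential analogue
            z[i] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
        if step >= burn_in:
            samples.append(z.copy())
    return np.array(samples)

# toy usage: two mutually excitatory units tend to be co-active
W = np.array([[0.0, 1.5], [1.5, 0.0]])
b = np.array([-0.5, -0.5])
states = gibbs_sample(W, b)
print(states.mean(axis=0))  # empirical marginal firing probabilities
```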
Knowledge graphs are an expressive and widely used data structure due to their ability to integrate data from different domains in a sensible, machine-readable way. Thus, they can be used to model a variety of systems such as molecules and social networks. However, it still remains an open question how symbolic reasoning could be realized in spiking systems and, therefore, how spiking neural networks could be applied to graph data. Here, we extend previous work on spike-based graph algorithms by demonstrating how multi-relational information can be encoded using neurons,...
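A toy sketch of one way the idea above could look in practice: each entity is represented by a vector of spike times, each relation by a vector of temporal shifts, and a triple is scored by how well the shifted subject spike times match the object's. The entity and relation names, dimensions and scoring form are illustrative assumptions, not the encoding from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 8  # number of neurons (spike times) per embedding

# entity embeddings: one spike time per neuron (arbitrary toy values)
entities = {name: rng.uniform(0.0, 1.0, DIM) for name in ["plc", "sensor", "alarm"]}
# relation embeddings: temporal shifts applied to the subject's spike times
relations = {name: rng.uniform(-0.2, 0.2, DIM) for name in ["controls", "triggers"]}

def triple_score(subj, rel, obj):
    """Lower is better: distance between shifted subject spike times and the
    object's spike times, analogous to a translation-style score on spike-time vectors."""
    return np.sum(np.abs(entities[subj] + relations[rel] - entities[obj]))

print(triple_score("plc", "controls", "sensor"))
print(triple_score("sensor", "triggers", "alarm"))
```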
Machine learning on graph-structured data has recently become a major topic in industry and research, finding many exciting applications such as recommender systems and automated theorem proving. We propose an energy-based graph embedding algorithm to characterize industrial automation systems, integrating knowledge from different domains like automation, communications and cybersecurity. By combining multiple domains, the learned model is capable of making context-aware predictions regarding novel...
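To make the notion of an energy-based graph embedding concrete, here is a minimal training sketch with margin-based ranking and random negative sampling; the energy function, entity identifiers and all hyperparameters are illustrative assumptions rather than the algorithm proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(h, r, t):
    """Energy of a (head, relation, tail) triple; low energy = plausible fact."""
    return np.sum(np.abs(h + r - t))

def train_epoch(E, R, triples, lr=0.01, margin=1.0):
    """One epoch of margin-based training with random negative tails."""
    for h, r, t in triples:
        t_neg = rng.integers(len(E))                   # corrupt the tail at random
        pos, neg = energy(E[h], R[r], E[t]), energy(E[h], R[r], E[t_neg])
        if pos + margin > neg:                         # margin violated: lower the positive energy
            grad = np.sign(E[h] + R[r] - E[t])
            E[h] -= lr * grad
            R[r] -= lr * grad
            E[t] += lr * grad

# toy graph over hypothetical automation/communication/security entities
E = rng.normal(size=(5, 8)) * 0.1   # 5 entities, 8-dim embeddings
R = rng.normal(size=(2, 8)) * 0.1   # 2 relation types
triples = [(0, 0, 1), (1, 1, 2), (3, 0, 4)]
for _ in range(100):
    train_epoch(E, R, triples)
```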
One of the most fundamental laws of physics is the principle of least action. Motivated by its predictive power, we introduce a neuronal least-action principle that we apply to motor control. The central notion is the somato-dendritic mismatch error within individual neurons. The principle postulates that the mismatch errors across all neurons in a cortical network are minimized by the voltage dynamics. Ongoing synaptic plasticity reduces the mismatch error within each neuron and performs gradient descent on the output cost in real time. The neuronal activity is prospective, ensuring that dendritic errors deep...
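A hedged sketch, in symbols introduced here purely for illustration (not the paper's notation), of how a somato-dendritic mismatch energy and its reduction by plasticity might be written: the first term sums squared differences between somatic voltages and their dendritic predictions across layers, the second term is the output cost weighted by a nudging strength, and the plasticity rule moves the weights down the gradient of the local mismatch.

```latex
E(\mathbf{u}, W) \;=\; \tfrac{1}{2}\sum_{\ell}\,\big\lVert \bar{\mathbf{u}}_{\ell} - W_{\ell}\,\bar{\mathbf{r}}_{\ell-1} \big\rVert^{2}
\;+\; \beta\, C\!\big(\bar{\mathbf{u}}_{N}\big),
\qquad
\dot{W}_{\ell} \;\propto\; \big(\bar{\mathbf{u}}_{\ell} - W_{\ell}\,\bar{\mathbf{r}}_{\ell-1}\big)\,\bar{\mathbf{r}}_{\ell-1}^{\top}
```

Here \(\bar{\mathbf{u}}_{\ell}\) stands for the somatic voltages of layer \(\ell\), \(W_{\ell}\bar{\mathbf{r}}_{\ell-1}\) for the dendritic prediction from presynaptic firing, and \(\beta\) for the strength with which the output cost \(C\) nudges the network; all of these are assumed placeholders for the quantities the abstract refers to.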
For a biological agent operating under environmental pressure, energy consumption and reaction times are of critical importance. Similarly, engineered systems are optimized for short time-to-solution and low energy-to-solution characteristics. At the level of neuronal implementation, this implies achieving the desired results with as few and as early spikes as possible. With time-to-first-spike coding, both of these goals are inherently emerging features of learning. Here, we describe a rigorous derivation of a learning rule for such...
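A toy sketch of what time-to-first-spike coding means operationally: the predicted class is the output neuron that crosses threshold first, so fewer and earlier spikes directly translate into faster, cheaper decisions. The crude integration model, threshold, time constant and grid resolution below are illustrative assumptions, not the analytical neuron model or the derived learning rule from the paper.

```python
import numpy as np

def first_spike_times(weights, input_times, threshold=1.0, tau=1.0):
    """Approximate the first threshold crossing of each output neuron.

    Inputs are exponentially decaying kernels starting at their spike times;
    outputs integrate them linearly. Returns np.inf for neurons that never fire.
    """
    t_grid = np.linspace(0.0, 5.0 * tau, 500)
    kernels = np.where(t_grid[None, :] >= input_times[:, None],
                       np.exp(-(t_grid[None, :] - input_times[:, None]) / tau), 0.0)
    potentials = weights.T @ kernels                      # (outputs, time)
    crossed = potentials >= threshold
    return np.where(crossed.any(axis=1),
                    t_grid[np.argmax(crossed, axis=1)], np.inf)

# toy usage: 20 input spike times, 3 output classes; earliest spike wins
rng = np.random.default_rng(0)
W = rng.uniform(0.0, 1.0, size=(20, 3))
x = rng.uniform(0.0, 2.0, size=20)
t_out = first_spike_times(W, x)
print("predicted class:", int(np.argmin(t_out)), "first-spike times:", t_out)
```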
Computing latent representations for graph-structured data is a ubiquitous learning task in many industrial and academic applications, ranging from molecule synthetization to social network analysis and recommender systems. Knowledge graphs are among the most popular and widely used data representations related to the Semantic Web. Next to structuring factual knowledge in a machine-readable format, they serve as a backbone of artificial intelligence and allow the ingestion of context information into various learning algorithms. Graph neural networks attempt...
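A minimal sketch of the kind of message-passing step graph neural networks perform when computing latent node representations, written in NumPy under simplifying assumptions (mean aggregation over neighbours, a single weight matrix, a ReLU); it is not the specific architecture the paper studies.

```python
import numpy as np

def gnn_layer(adjacency, node_features, weight):
    """One message-passing step: average neighbour features (including self),
    project them with a learned weight matrix and apply a ReLU nonlinearity."""
    A_hat = adjacency + np.eye(adjacency.shape[0])   # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)
    messages = (A_hat @ node_features) / deg         # mean aggregation over neighbours
    return np.maximum(messages @ weight, 0.0)        # latent node representations

# toy usage: 4 nodes, 3 input features, 2-dimensional latent space
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
X = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
print(gnn_layer(A, X, W))
```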
We present a novel software feature for the BrainScaleS-2 accelerated neuromorphic platform that facilitates the partitioned emulation of large-scale spiking neural networks. This approach is well suited to many deep networks, where the constraint of the largest recurrent subnetwork fitting on the substrate, or the limited fan-in of neurons, is often not a limitation in practice. We demonstrate the training of two network models, using the MNIST and EuroSAT datasets, that exceed the physical size constraints of a single-chip system. The ability to...