- Quantum Computing Algorithms and Architecture
- Quantum Information and Cryptography
- Quantum-Dot Cellular Automata
- Neural Networks and Reservoir Computing
- Machine Learning and Algorithms
- Quantum Many-Body Systems
- Neural Networks and Applications
- Machine Learning in Materials Science
- Quantum and Electron Transport Phenomena
- Machine Learning and ELM
- Spectral Theory in Mathematical Physics
- Stochastic Gradient Optimization Techniques
- Statistical Mechanics and Entropy
Freie Universität Berlin
2021-2025
Fraunhofer Institute for Telecommunications, Heinrich Hertz Institute
2021-2025
Universitat de Barcelona
2020
A single qubit provides sufficient computational capabilities to construct a universal quantum classifier when assisted with a classical subroutine. This fact may be surprising, since a single qubit only offers a simple superposition of two states, and single-qubit gates only make rotations in the Bloch sphere. The key ingredient to circumvent these limitations is to allow for multiple data re-uploading. The quantum circuit can then be organized as a series of data re-uploading and processing units. Furthermore, both data re-uploading and measurements can accommodate multiple dimensions...
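As a minimal sketch of the re-uploading idea (not the paper's implementation), a single-qubit circuit that interleaves trainable rotations with repeated encodings of the data can be simulated with 2×2 matrices; the choice of RY rotations, the layer structure, and the parameter names here are illustrative assumptions:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def classifier(x, thetas, weights):
    """Re-upload the scalar input x once per layer: each layer applies a
    rotation whose angle mixes a trainable parameter and the data point."""
    state = np.array([1.0, 0.0])  # start in |0>
    for theta, w in zip(thetas, weights):
        state = ry(theta + w * x) @ state
    return abs(state[0]) ** 2  # probability of measuring |0>

# toy usage: three re-uploading layers with made-up parameters
p = classifier(0.5, thetas=[0.1, 0.2, 0.3], weights=[1.0, -0.5, 2.0])
```

Because the data enters the angle of every layer, deeper circuits realize richer functions of x than a single encoding could, which is the mechanism the abstract describes.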
A blueprint for exploiting symmetries in the construction of variational quantum learning models that can result in improved generalization performance is developed and demonstrated on practical problems.
Kernel methods are a cornerstone of classical machine learning. The idea of using quantum computers to compute kernels has recently attracted attention. Quantum embedding kernels (QEKs), constructed by embedding data into the Hilbert space of a quantum computer, are a particular kernel technique that is especially suitable for noisy intermediate-scale quantum devices. Unfortunately, quantum kernels face three major problems: constructing the kernel matrix has quadratic computational complexity in the number of training samples, choosing the right kernel function is nontrivial, and...
A large body of recent work has begun to explore the potential of parametrized quantum circuits (PQCs) as machine learning models, within the framework of hybrid quantum-classical optimization. In particular, theoretical guarantees on the out-of-sample performance of such models, in terms of generalization bounds, have emerged. However, none of these bounds depend explicitly on how the classical input data is encoded into the PQC. We derive generalization bounds for PQC-based models that depend explicitly on the strategy used for data-encoding. These imply performance guarantees for trained PQC-based models on unseen data...
The double descent phenomenon challenges traditional statistical learning theory by revealing scenarios where larger models do not necessarily lead to reduced performance on unseen data. While this counterintuitive behavior has been observed in a variety of classical machine learning models, particularly modern neural network architectures, it remains elusive within the context of quantum machine learning. In this work, we analytically demonstrate that quantum learning models can exhibit double descent, drawing insights from linear regression and random...
One of the most natural connections between quantum and classical machine learning has been established in the context of kernel methods. Kernel methods rely on kernels, which are inner products of feature vectors living in large feature spaces. Quantum kernels are typically evaluated by explicitly constructing quantum states and then taking their inner product, here called embedding kernels. Since classical kernels are usually evaluated without using the feature vectors explicitly, we wonder how expressive embedding kernels are. In this work, we raise the fundamental question: can all quantum kernels be expressed as the inner product...
The quest for successful variational quantum machine learning (QML) relies on the design of suitable parametrized quantum circuits (PQCs), as analogues to neural networks in classical machine learning. Successful QML models must fulfill the properties of trainability and non-dequantization, among others. Recent works have highlighted an intricate interplay between trainability and dequantization of such models, which is still unresolved. In this work we contribute to this debate from the perspective of machine learning, proving a number of results...
Variational quantum machine learning is an extensively studied application of near-term quantum computers. The success of variational models crucially depends on finding a suitable parametrization of the model that encodes an inductive bias relevant to the task. However, precious little is known about guiding principles for the construction of such parametrizations. In this work, we holistically explore when and how symmetries of the learning problem can be exploited to construct quantum learning models with outcomes invariant under the symmetry of the learning task. Building on tools from...
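A minimal classical sketch of the symmetry idea is group averaging (twirling): given any predictor, averaging it over a symmetry group of input transformations yields a predictor whose outputs are exactly invariant under that symmetry. The quantum construction in the abstract works with equivariant circuit parametrizations instead; the function, group, and names below are purely illustrative:

```python
import numpy as np

def symmetrize(f, group):
    """Return a predictor invariant under the given group of input
    transformations by averaging f over all group elements (twirling)."""
    def f_inv(x):
        return np.mean([f(g(x)) for g in group])
    return f_inv

# toy example: enforce invariance under the sign flip x -> -x
f = lambda x: np.sin(x) + x ** 2            # not flip-invariant on its own
group = [lambda x: x, lambda x: -x]         # the Z_2 group action
f_inv = symmetrize(f, group)
```

Here the odd part sin(x) averages away, leaving the flip-invariant x**2, so f_inv(x) == f_inv(-x) by construction; encoding the symmetry directly in the model is what the paper argues yields a useful inductive bias.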
A common trait of many machine learning models is that it is often difficult to understand and explain what caused the model to produce a given output. While the explainability of neural networks has been an active field of research in recent years, comparably little is known for quantum machine learning models. Despite a few recent works analyzing some specific aspects of explainability, as of now there is no clear big-picture perspective on what can be expected from quantum learning models in terms of explainability. In this work, we address this issue by identifying promising...
Quantum machine learning is arguably one of the most explored applications of near-term quantum devices. Much focus has been put on notions of variational quantum machine learning, where parameterized quantum circuits (PQCs) are used as models. These PQC models have a rich structure which suggests that they might be amenable to efficient dequantization via random Fourier features (RFF). In this work, we establish necessary and sufficient conditions under which RFF does indeed provide an efficient dequantization for regression. We build on these insights to make...
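Random Fourier features, the classical technique the dequantization argument builds on, replace a shift-invariant kernel by an explicit low-dimensional feature map whose inner products approximate the kernel. A standard sketch for the Gaussian kernel (hyperparameters here are arbitrary choices, not from the paper) looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)
d, D, sigma = 3, 5000, 1.0            # input dim, number of features, bandwidth

# sample random frequencies and phases defining the feature map
W = rng.standard_normal((D, d)) / sigma
b = rng.uniform(0, 2 * np.pi, D)

def phi(x):
    """Random Fourier feature map approximating the Gaussian kernel."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))
approx = phi(x) @ phi(y)              # converges to `exact` as D grows
```

The approximation error shrinks like O(1/sqrt(D)), which is why a PQC model whose kernel has concentrated Fourier spectrum can, under the conditions the paper derives, be mimicked by such classical features.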
Kernel methods are a cornerstone of classical machine learning. The idea of using quantum computers to compute kernels has recently attracted attention. Quantum embedding kernels (QEKs), constructed by embedding data into the Hilbert space of a quantum computer, are a particular kernel technique that allows us to gather insights into learning problems and that is particularly suitable for noisy intermediate-scale quantum devices. In this work, we first provide an accessible introduction and then analyze the practical issues arising when realizing them on near-term...
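The core QEK construction can be sketched classically: embed each data point as a quantum state and take the squared overlap of two embedded states as the kernel value. The single-qubit RY embedding below is a toy assumption chosen only to keep the simulation to 2×2 matrices:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def embed(x):
    """Toy data-embedding state |phi(x)> = RY(x)|0>."""
    return ry(x) @ np.array([1.0, 0.0])

def qek(x1, x2):
    """Quantum embedding kernel: squared overlap of embedded states."""
    return abs(np.vdot(embed(x1), embed(x2))) ** 2

# Gram matrix for three sample inputs; on hardware each entry would be
# estimated from measurement statistics rather than computed exactly
K = np.array([[qek(a, b) for b in (0.0, 0.4, 1.1)]
              for a in (0.0, 0.4, 1.1)])
```

The resulting Gram matrix is symmetric with unit diagonal; the practical issues the abstract refers to (quadratic cost, device noise in estimating the overlaps) arise when each entry must be measured on hardware.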
Quantum machine learning models have shown successful generalization performance even when trained with few data. In this work, through systematic randomization experiments, we show that traditional approaches to understanding generalization fail to explain the behavior of such quantum models. Our experiments reveal that state-of-the-art quantum neural networks accurately fit random states and random labelings of training data. This ability to memorize random data defies current notions of small generalization error, problematizing approaches that build on complexity measures such as the VC...
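The randomization experiments described above have a simple classical analogue: an overparameterized model can drive the training error on completely random labels to zero, so small training error alone cannot certify generalization. A hypothetical minimal version with a linear model (sizes chosen arbitrarily) is:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 30, 60                          # fewer samples than parameters
X = rng.standard_normal((n, p))        # random training inputs
y = rng.choice([-1.0, 1.0], size=n)    # labels assigned completely at random

# the minimum-norm least-squares solution interpolates the random labels
w = np.linalg.pinv(X) @ y
train_mse = np.mean((X @ w - y) ** 2)  # essentially zero despite no signal
```

Since the labels carry no information, the near-zero training error here says nothing about test performance, which is exactly the point the abstract makes for quantum models and uniform complexity measures.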