Subtleties in the trainability of quantum machine learning models
Quantum Machine Learning
Speedup
DOI:
10.1007/s42484-023-00103-6
Publication Date:
2023-05-15T14:39:49Z
AUTHORS (5)
ABSTRACT
A new paradigm for data science has emerged, with quantum data, quantum models, and quantum computational devices. This field, called quantum machine learning (QML), aims to achieve a speedup over traditional data analysis. However, its success usually hinges on efficiently training the parameters in quantum neural networks, and the field of QML is still lacking theoretical scaling results on their trainability. Some trainability results have been proven for the closely related field of variational quantum algorithms (VQAs). While both fields involve training a parametrized quantum circuit, there are crucial differences that make the results for one setting not readily applicable to the other. In this work, we bridge the two frameworks and show that gradient scaling results for VQAs can also be applied to study the gradient scaling of QML models. Our results indicate that features deemed detrimental for VQA trainability can also lead to issues, such as barren plateaus, in QML. Consequently, our work has implications for several QML proposals in the literature. In addition, we provide numerical evidence that QML models can exhibit further trainability issues not present in VQAs, arising from the use of a dataset. We refer to these as dataset-induced barren plateaus. These results are most relevant when dealing with classical data, as here the choice of embedding scheme (i.e., the map between classical data and quantum states) can greatly affect the gradient scaling.
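To illustrate the barren plateau phenomenon the abstract refers to, the sketch below estimates the variance of a cost gradient, Var[∂C/∂θ], over random parameters of a layered parametrized circuit, in pure NumPy. This is a minimal illustration, not the paper's own setup: the RY/CZ hardware-efficient ansatz, the depth choice (layers equal to the number of qubits), the ⟨Z⟩ cost on qubit 0, and the sample count are all illustrative assumptions. For sufficiently deep random circuits of this kind, the printed variance is expected to shrink exponentially with the number of qubits, which is the signature of a barren plateau.

```python
import numpy as np

def ry(theta):
    # Single-qubit rotation RY(theta) = exp(-i theta Y / 2)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def apply_single(state, gate, wire, n):
    # Contract a 2x2 gate onto axis `wire` of the n-qubit statevector
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [wire]))
    return np.moveaxis(psi, 0, wire).reshape(-1)

def apply_cz(state, a, b, n):
    # CZ: flip the sign of amplitudes where both qubits a and b are |1>
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[a], idx[b] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def circuit(params, n, layers):
    # Hardware-efficient ansatz: layers of RY rotations plus a CZ ladder
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for w in range(n):
            state = apply_single(state, ry(params[k]), w, n)
            k += 1
        for w in range(n - 1):
            state = apply_cz(state, w, w + 1, n)
    return state

def cost(params, n, layers):
    # Cost C = <Z> on qubit 0, i.e. P(qubit 0 = 0) - P(qubit 0 = 1)
    psi = circuit(params, n, layers).reshape([2] * n)
    return np.sum(np.abs(psi[0]) ** 2) - np.sum(np.abs(psi[1]) ** 2)

def grad_first_param(params, n, layers):
    # Parameter-shift rule, exact for RY: dC/dtheta = [C(+pi/2) - C(-pi/2)] / 2
    plus, minus = params.copy(), params.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (cost(plus, n, layers) - cost(minus, n, layers))

rng = np.random.default_rng(0)
for n in range(2, 9, 2):
    layers = n  # depth grows with width so the circuit approaches a design
    grads = [grad_first_param(rng.uniform(0, 2 * np.pi, n * layers), n, layers)
             for _ in range(200)]
    print(f"{n} qubits: Var[dC/dtheta_1] = {np.var(grads):.2e}")
```

The same scaffold can be used to probe the abstract's point about embedding schemes: replacing the first rotation layer with a data-encoding layer and averaging the gradient variance over a dataset would show how the choice of embedding affects gradient scaling, though the specific behavior depends on the encoding chosen.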