Simulating lossy Gaussian boson sampling with matrix product operators
FOS: Physical sciences; Computer and information sciences
Quantum Physics (quant-ph)
Computational Complexity (cs.CC)
Computational Physics (physics.comp-ph)
DOI: 10.48550/arXiv.2301.12814
Publication Date: 2023-11-13
AUTHORS (5)
ABSTRACT
16 pages, 11 figures. To appear in PRA. This article supersedes arXiv:2303.11409

Gaussian boson sampling, a computational model widely believed to admit quantum supremacy, has already been experimentally demonstrated and is claimed to surpass the classical simulation capabilities of even the most powerful supercomputers today. However, whether the current approach, limited by photon loss and noise in such experiments, prescribes a scalable path to quantum advantage is an open question. To understand the effect of photon loss on the scalability of Gaussian boson sampling, we analytically derive the asymptotic scaling of the operator entanglement entropy, which relates to the simulation complexity. As a result, we observe that efficient tensor network simulations are likely possible under the $N_\text{out}\propto\sqrt{N}$ scaling of the number of surviving photons $N_\text{out}$ in the number of input photons $N$. We numerically verify this result using a tensor network algorithm with $U(1)$ symmetry, and we overcome previous challenges due to the large local Hilbert space dimensions in Gaussian boson sampling with hardware acceleration. Additionally, we observe that increasing the photon number through larger squeezing does not significantly increase the entanglement entropy. Finally, we numerically find the bond dimension necessary for fixed-accuracy simulations, providing more direct evidence for the complexity of the tensor network simulation.
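The abstract ties simulation cost to the operator entanglement entropy across a bipartition of the optical modes, with the bond dimension needed for a fixed accuracy growing roughly exponentially in that entropy. As a hedged illustration of this relationship only (not the paper's Gaussian-state construction, loss model, or $U(1)$-symmetric algorithm), the toy Python sketch below computes the operator entanglement entropy of a vectorized operator from its operator-Schmidt (singular-value) spectrum and turns it into a crude bond-dimension estimate; the local dimension `d`, the mode counts, and the random operator are illustrative assumptions.

```python
import numpy as np

def operator_entanglement_entropy(op_vec, dim_left, dim_right):
    """Entanglement entropy of a vectorized operator across a left/right cut."""
    mat = op_vec.reshape(dim_left, dim_right)   # operator-Schmidt decomposition
    s = np.linalg.svd(mat, compute_uv=False)
    p = s**2 / np.sum(s**2)                     # normalized Schmidt spectrum
    p = p[p > 1e-16]                            # drop numerical zeros
    return float(-np.sum(p * np.log(p)))

# Toy example (illustrative only): a random operator on 2 + 2 modes, each mode
# truncated to local Hilbert-space dimension d = 3, so the vectorized operator
# carries d**2 levels per mode.
d, n_left, n_right = 3, 2, 2
rng = np.random.default_rng(0)
op_vec = rng.normal(size=(d**2) ** (n_left + n_right))

S = operator_entanglement_entropy(op_vec, (d**2) ** n_left, (d**2) ** n_right)
print(f"operator entanglement entropy across the cut: {S:.3f}")
print(f"rough bond-dimension estimate exp(S): {np.exp(S):.1f}")
```

For a matrix product operator, the same entropy computed at every cut bounds how large the bond dimension must be to keep the truncation error fixed, which is the quantity the paper estimates numerically.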