Abstract

BosonSampling and Random Circuit Sampling are important both as a theoretical tool for separating quantum and classical computation, and as an experimental means of demonstrating quantum speedups. Prior works have shown that average-case hardness of sampling follows from certain unproven conjectures about the hardness of computing output probabilities, such as the Permanent-of-Gaussians Conjecture (PGC), which states that it is \textsf{\#P}-hard to compute $e^{-n\log{n}-n-O(\log n)}$ additive-error estimates of the output probabilities of most random BosonSampling experiments. To date, however, only weaker average-case hardness results that do not imply sampling hardness have been shown, and proving these conjectures has become a central question in quantum complexity.
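
For intuition, a rough sketch under an assumption not stated above: in the standard setting of $m=\Theta(n^2)$ modes, each output probability is approximately $|\mathrm{Per}(X)|^2/m^n$ for an i.i.d. Gaussian matrix $X$ with $\mathbb{E}\,|\mathrm{Per}(X)|^2 = n!$, so by Stirling's approximation the PGC error scale is the typical output probability up to a $\mathrm{poly}(n)$ factor:
\[
e^{-n\log n - n - O(\log n)} \;=\; \frac{e^{-n\log n - n}}{\mathrm{poly}(n)} \;\approx\; \frac{n!}{n^{2n}\,\mathrm{poly}(n)} \;=\; \frac{\mathbb{E}\,|\mathrm{Per}(X)|^2}{m^{n}\,\mathrm{poly}(n)} \quad (m=n^2),
\]
using $n! = \sqrt{2\pi n}\,(n/e)^n\,(1+o(1)) = e^{\,n\log n - n + O(\log n)}$.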

In this work, we show that it is \textsf{\#P}-hard to compute $e^{-n\log n -n - O(n^\delta)}$ additive-error estimates of the output probabilities of most random BosonSampling experiments, for any $\delta>0$, exponentially improving on prior work. In the process, we circumvent all known barrier results for proving PGC. The remaining hurdle to proving PGC is now ``merely'' to show that the $O(n^\delta)$ in the exponent can be improved to $O(\log n)$. We also obtain an analogous result for Random Circuit Sampling.
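
In the same rough sketch, the remaining gap can be read as a multiplicative slack relative to the typical output probability:
\[
e^{-n\log n - n - O(n^\delta)} \;\approx\; \frac{\text{typical probability}}{e^{\,O(n^\delta)}},
\qquad
e^{-n\log n - n - O(\log n)} \;\approx\; \frac{\text{typical probability}}{\mathrm{poly}(n)},
\]
so the hardness shown here is for estimates that are sub-exponentially finer than those PGC concerns; closing this $e^{O(n^\delta)}$-versus-$\mathrm{poly}(n)$ gap is exactly the remaining hurdle.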

We then show, for the first time, a hardness result for average-case classical \textit{sampling} from BosonSampling, under an anticoncentration conjecture. Specifically, we prove that multiplicative-error sampling from random BosonSampling experiments is impossible, with probability $1-2^{-\tilde O(N^{1/3})}$ for input size $N$, unless the Polynomial Hierarchy collapses. This exponentially improves upon the state of the art. To do this, we introduce new proof techniques that tolerate exponential loss in the worst-to-average-case reduction, opening the possibility of showing hardness of average-case sampling without ever proving PGC.
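
For reference, the standard notion of multiplicative-error sampling (in the sense of Aaronson and Arkhipov; a sketch of that standard definition, not necessarily this paper's exact formulation): a classical sampler achieves multiplicative error $\varepsilon$ for a target distribution $\mathcal{D}$ if it samples from some distribution $\mathcal{D}'$ satisfying
\[
\bigl|\,\Pr_{\mathcal{D}'}[y] - \Pr_{\mathcal{D}}[y]\,\bigr| \;\le\; \varepsilon \cdot \Pr_{\mathcal{D}}[y] \qquad \text{for every outcome } y.
\]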
