BosonSampling Scott Aaronson (MIT) April 18, 2014
The Extended Church-Turing Thesis (ECT): Everything feasibly computable in the physical world is feasibly computable by a (probabilistic) Turing machine. Shor’s Theorem: QUANTUM SIMULATION has no efficient classical algorithm, unless FACTORING does
So the ECT is false … what more evidence could anyone want? Building a QC able to factor large numbers is damn hard! After 16 years, no fundamental obstacle has been found, but who knows? Can’t we “meet the physicists halfway,” and show computational hardness for quantum systems closer to what they actually work with now? FACTORING might have a fast classical algorithm! At any rate, it’s an extremely “special” problem. Wouldn’t it be great to show that if quantum computers can be simulated classically, then (say) P=NP?
BosonSampling (A.-Arkhipov 2011): A rudimentary type of quantum computing, involving only non-interacting photons. Classical counterpart: Galton’s Board. Replacing the balls by photons leads to famously counterintuitive phenomena, like the Hong-Ou-Mandel dip
In general, we consider a network of beamsplitters, with n input “modes” (locations) and m>>n output modes. n identical photons enter, one per input mode. Assume for simplicity they all leave in different modes—there are (m choose n) possibilities S. The beamsplitter network defines a column-orthonormal matrix A ∈ C^(m×n), such that Pr[S] = |Per(A_S)|², where A_S is the n×n submatrix of A corresponding to S, and Per(X) = Σ_{σ∈S_n} ∏_{i=1}^n x_{i,σ(i)} is the matrix permanent
Example: For the Hong-Ou-Mandel experiment, Pr[the two photons land in different modes] = |Per( (1/√2)·(1 1; 1 −1) )|² = 0. In general, an n×n complex permanent is a sum of n! terms, almost all of which cancel. How hard is it to estimate the “tiny residue” left over? Answer: #P-complete, even for constant-factor approximation (Contrast with nonnegative permanents!)
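The permanent and the HOM dip can be checked directly. Below is a minimal sketch (function name mine, not from the talk) computing the permanent via Ryser’s formula in O(2ⁿ·n) time, and verifying that the 2×2 beamsplitter matrix has permanent 0, so two photons never exit in different modes:

```python
import numpy as np

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n) time."""
    n = A.shape[0]
    total = 0.0
    # Sum over all nonempty column subsets, encoded as bitmasks
    for mask in range(1, 1 << n):
        cols = [j for j in range(n) if mask >> j & 1]
        row_sums = A[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * np.prod(row_sums)
    return (-1) ** n * total

# Hong-Ou-Mandel: the balanced-beamsplitter matrix has permanent 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(abs(permanent(H)) ** 2)  # ~0: the HOM dip
```

Note that the two n! = 2 terms of Per(H) cancel exactly, which is the “almost all of which cancel” phenomenon in miniature.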
So, Can We Use Quantum Optics to Solve a #P-Complete Problem? That sounds way too good to be true… Explanation: If X is sub-unitary, then |Per(X)|² will usually be exponentially small. So to get a reasonable estimate of |Per(X)|² for a given X, we’d generally need to repeat the optical experiment exponentially many times
Better idea: Given A ∈ C^(m×n) as input, let BosonSampling be the problem of merely sampling from the same distribution D_A that the beamsplitter network samples from—the one defined by Pr[S]=|Per(A_S)|². Theorem (A.-Arkhipov 2011): Suppose BosonSampling is solvable in classical polynomial time. Then P^#P = BPP^NP. Upshot: Compared to (say) Shor’s factoring algorithm, we get different/stronger evidence that a weaker system can do something classically hard. Better Theorem: Suppose we can sample D_A approximately in classical polynomial time. Then in BPP^NP, it’s possible to estimate |Per(X)|², with high probability over a Gaussian random matrix X. We conjecture that the above problem is already #P-complete. If it is, then even a fast classical algorithm for approximate BosonSampling would have the consequence that P^#P = BPP^NP
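For tiny n and m, the distribution D_A can be sampled by brute force: enumerate all collision-free outcomes S, weight each by |Per(A_S)|², and draw one. A sketch under that framing (names mine; exponential time, illustration only):

```python
from itertools import combinations
import numpy as np

def permanent(A):
    """Permanent via Ryser's formula, O(2^n * n) time."""
    n = A.shape[0]
    total = 0.0
    for mask in range(1, 1 << n):
        cols = [j for j in range(n) if mask >> j & 1]
        total += (-1) ** len(cols) * np.prod(A[:, cols].sum(axis=1))
    return (-1) ** n * total

def boson_sample(A, rng=np.random.default_rng()):
    """A: m x n column-orthonormal matrix. Draw one outcome S (a set of
    n output modes) from the collision-free part of D_A, renormalized."""
    m, n = A.shape
    outcomes = list(combinations(range(m), n))
    probs = np.array([abs(permanent(A[list(S), :])) ** 2 for S in outcomes])
    probs /= probs.sum()  # assumes some collision-free outcome has nonzero weight
    return outcomes[rng.choice(len(outcomes), p=probs)]
```

The hardness results say precisely that no trick avoids this exponential blowup: any classical sampler running in polynomial time would collapse P^#P to BPP^NP.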
Related Work. Valiant 2001, Terhal-DiVincenzo 2002, “folklore”: A QC built of noninteracting fermions can be efficiently simulated by a classical computer. Knill, Laflamme, Milburn 2001: Noninteracting bosons plus adaptive measurements yield universal QC. Jerrum-Sinclair-Vigoda 2001: Fast classical randomized algorithm to approximate Per(A) for nonnegative A. Gurvits 2002: O(n²/ε²) classical randomized algorithm to approximate an n-photon amplitude to ±ε additive error (also, to compute k-mode marginal distributions in n^O(k) time)
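One way to realize Gurvits-style estimation is via the Glynn formula: for x uniform over {−1,1}ⁿ, the quantity (∏ᵢxᵢ)·∏ᵢ(Ax)ᵢ has expectation exactly Per(A), and each sample costs O(n²), so averaging O(1/ε²) samples matches the O(n²/ε²) bound. A hedged sketch (names mine, not Gurvits’s original presentation):

```python
import numpy as np

def gurvits_estimate(A, num_samples=10000, rng=np.random.default_rng(0)):
    """Unbiased randomized estimate of Per(A) via the Glynn identity:
    Per(A) = E_x[ prod(x) * prod(Ax) ] for x uniform in {-1,1}^n."""
    n = A.shape[0]
    est = 0.0
    for _ in range(num_samples):
        x = rng.choice([-1.0, 1.0], size=n)
        est += np.prod(x) * np.prod(A @ x)  # one O(n^2) sample
    return est / num_samples
```

The additive error scales with ‖A‖ⁿ, which is fine for (sub-)unitary A but useless for recovering exponentially small permanents to multiplicative accuracy—consistent with the #P-hardness of the latter.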
BosonSampling Experiments: In 2012, groups in Brisbane, Oxford, Rome, and Vienna reported the first 3-photon BosonSampling experiments, confirming that the amplitudes were given by 3×3 permanents. # of experiments > # of photons!
Obvious Challenges for Scaling Up: - Reliable single-photon sources (optical multiplexing?) - Minimizing losses - Getting high probability of n-photon coincidence. Goal (in our view): Scale to 10-30 photons. Don’t want to scale much beyond that—both because (1) you probably can’t without fault-tolerance, and (2) a classical computer probably couldn’t even verify the results!
Scattershot BosonSampling: Exciting new idea, proposed by Steve Kolthammer, for sampling a hard distribution even with highly unreliable (but heralded) photon sources, like SPDCs. The idea: Say you have 100 sources, of which only 10 (on average) generate a photon. Then just detect which sources succeed, and use those to define your BosonSampling instance! Complexity analysis goes through essentially without change. Issues: Increases depth of optical network needed. Also, if some sources generate ≥2 photons, need a new hardness assumption
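The heralding step can be illustrated with a toy simulation (function and parameter names mine; the 100-source, 10%-success numbers are the slide’s):

```python
import numpy as np

def scattershot_inputs(num_sources=100, p_success=0.10,
                       rng=np.random.default_rng()):
    """Simulate heralded-but-unreliable sources: each fires independently
    with probability p_success; return the input modes that got a photon."""
    fired = rng.random(num_sources) < p_success
    return np.flatnonzero(fired).tolist()

modes = scattershot_inputs()
print(f"{len(modes)} sources fired; run BosonSampling on input modes {modes}")
```

Since the heralds tell us which ~10 input modes are occupied, each run defines a fresh BosonSampling instance on those rows, rather than a fixed instance that fails whenever a source misfires.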
Open Problems: Prove that Gaussian permanent approximation is #P-hard (first step: understand the distribution of Gaussian permanents). Can the BosonSampling model solve classically-hard decision problems? With verifiable answers? Can one efficiently sample a distribution that can’t be efficiently distinguished from BosonSampling? Similar hardness results for other natural quantum systems (besides linear optics)? Bremner, Jozsa, Shepherd 2010: Another system for which exact classical simulation would collapse PH