Recent Progress in Quantum Algorithms

Shameless self-promotion: an article I wrote with Wim van Dam, “Recent Progress on Quantum Algorithms,” has appeared in the Communications of the ACM. Indeed, if you have a copy of the magazine you can check out an artist’s rendition of a quantum computer/quantum algorithm on the cover. Clearly quantum computing is the new string theory: so abstract that it must be represented by beautiful, yet incomprehensible, figures. Not sure if that’s a good or a bad thing. (The article was actually written quite a while back, so “recent” is a bit off. If we had to write it today, I’m guessing we would include the quantum algorithm for linear equations as well as the quantum Metropolis algorithm.)

25 Replies to “Recent Progress in Quantum Algorithms”

  1. Aram’s talk on the linear equations algorithm at QIP was really awesome. It’s such a ubiquitous problem that almost anyone working in science and engineering (and economics) can understand it, and it was great to see it attacked from a quantum-computational standpoint.

  2. Dave, QIP2010 helped me appreciate that a curious inversion is occurring in physics. During the 20th century the “Holy Grail” was classical experimental data predicted by an elegant theory … using (of course) classical resources to generate the prediction.
    Nowadays this paradigm is turning upside down: the “Holy Grail” of physics is—amazingly—classical experimental data that cannot be predicted by *any* theory using classical resources.
    Quantum computers were the first such devices, the classical output data being (famously) factors of large numbers, for example.
    We are not witnessing many more such experiments being designed … but frustratingly, none of the (many!) experiments proposed have (so far) turned out to be experimentally feasible.
    It’s enough to make a person wonder whether some censorship principle is at work.
    As Ashtekar and Schilling put it, we may wonder whether “the linear structure which is at the forefront in text-book treatments of quantum mechanics is, primarily, only a technical convenience and the essential ingredients—the manifold of states, the symplectic structure and the Riemannian metric—do not share this linearity.”
    This is the idea behind my two predictions for FOCS2019 on Dick Lipton’s blog that:
    FOCS 2019-I: Simulating noisy quantum systems is generically in the same computational complexity class as simulating classical systems
    FOCS 2019-II: M-Theory is the unique causally separable, relativistically invariant quantum field theory that can be simulated with classical computational resources
    The point being, that the toolset of linear Euclidean geometry remained useful even after it was discovered that classical state-space is curved and smaller than once thought. Similarly, the toolset of linear Hilbert geometry will remain useful even if it is discovered that quantum state-space is curved and smaller than once thought.
    In fact, a quantum universe that *could* be simulated with classical resources, would be fun … and mathematically rich … and it *might* even be the universe that we live in. The jury is still out on this (enjoyable!) question.

  3. “We are not witnessing many more such experiments being designed … but frustratingly, none of the (many!) experiments proposed have (so far) turned out to be experimentally feasible.”
    I would say something quite different. The only experiments that physicists do are the ones for which they can compare to numerical evidence. Without a quantum computer, any experiment that would need a quantum computer is not deemed interesting…unless, of course, there are other reasons for it to be interesting. Thus high-Tc, non-abelian anyons, etc. are interesting but frustrate physicists…they’re exactly the models that physicists would need a quantum computer to simulate.
    I recently heard someone describe topological order as “topological order is a new kind of order which explains all of condensed matter, which was what we thought Landau theory did before topological order, but this time topological orders are all we need to fix up the theory and explain everything.” How many new phases of matter must we find before we realize that the complexity of nature is far more vast than the textbooks say? We solve what we can solve; the rest is ignored.

  4. Sorry for the typo, Dave … I *meant* to type: “We are NOW witnessing many more such (non-simulatable) experiments being designed … ”
    I definitely agree with the latter half of your post, in which you say “the complexity of nature is far more vast than the textbooks say”. But isn’t this just as true of classical physics—or even plain old number theory—as it is of quantum physics?
    Here Terry Tao (commenting on Dick Lipton’s blog upon the topic Are The Reals Really Uncountable?) has given a wonderfully interesting list of examples in which nature’s complexity is greater than one might think.
    With regard to the middle part of your post, it’s my impression that quantum experimentalists and quantum simulationists are discovering reasons for similar humility. Two examples: (1) quantum experimentalists no longer expect to prepare exact ground states (of glassy materials) by subjecting them to a thermal bath, and (2) quantum simulationists no longer expect to simulate (on algebraic state-spaces) quantum post-selection experiments.
    In consequence of these (and other) humility-inducing considerations, actually *doing* experiments that can be actually *proven* to be un-simulatable, is IMHO likely to prove considerably more challenging than present-day QIT theorists optimistically expect!
    The good news for QIT theory (IMHO) is this: the emerging challenge “build an unsimulatable physical device” is just as fundamentally important in terms of math, science, and engineering—and furthermore is broader in scope and more naturally defined in complexity theory—as the older (too rigid?) challenge “build a quantum computer”.
    Surely there are fun times ahead, for quantum mathematicians, quantum scientists, and quantum systems engineers alike. Good! 🙂

  5. “I definitely agree with the latter half of your post, in which you say “the complexity of nature is far more vast than the textbooks say”. But isn’t this just as true of classical physics—or even plain old number theory—as it is of quantum physics?”
    But the classical experiments are at least tractable on a classical computer. You can do simulations of them, today. Experiments where quantum effects rear their heads? A challenge…and easier to ignore.

  6. I don’t think we’re disagreeing, Dave. Despite efforts, no-one has yet done any physical experiment (classical or quantum) whose dataset is provably hard to simulate with polynomial classical resources.
    And this includes quantum simulation challenges that in past decades were regarded as intractable … like hadronic mass measurements.
    Meanwhile, the practical capacity to simulate quantum systems has been increasing at a rate comparable to Moore’s Law (more properly, Moore’s scaling) for computing hardware—and this increase has been sustained over several decades.
    Why is this the case? Hey, *that* is a mystery! 🙂

  7. “I don’t think we’re disagreeing, Dave. Despite efforts, no-one has yet done any physical experiment (classical or quantum) whose dataset is provably hard to simulate with polynomial classical resources. ”
    I think we ARE disagreeing: I think we have done such physical experiments! For example, take high-Tc superconductors. Many of the properties of these systems can be explained by a 2D Hubbard model (http://www.springerlink.com/content/2602658404110583/). But if you look at what the numerics can do, “tractable” is not the right word (http://arxiv.org/abs/cond-mat/0610710). Indeed, there is the (embarrassing) situation that different numerical methods yield different phases for these models!
    But the bigger point is that physicists like to study systems they can build reasonable models of. Often these reasonable models are classical computational models. This BY DEFINITION excludes anything which cannot be so simulated. A physicist might call such systems “messy” and ignore them. But they aren’t necessarily messy…they just aren’t tractable without a quantum computer.
    Certainly no quantum computer has been built which cannot be classically simulated (in the sense of simulating the computation, not in the sense of simulating the physical system). But this doesn’t mean that there aren’t experiments doing intractable things…indeed, I’d wager that this is the norm, rather than the exception. The game of experimental physics, however, is stacked so as to not even consider this possibility.
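    To give a rough sense of why the 2D Hubbard model strains classical numerics, here is a back-of-the-envelope sketch (the lattice sizes are illustrative, not drawn from the papers linked above): each site can be empty, spin-up, spin-down, or doubly occupied, so brute-force exact diagonalization must handle a 4^n-dimensional Hilbert space.

```python
# Illustrative: why brute-force (exact-diagonalization) simulation of the
# 2D Hubbard model becomes intractable. Each site holds one of four
# occupation states (empty, up, down, up+down), so the many-body Hilbert
# space has dimension 4**n_sites, and storing the Hamiltonian as a dense
# matrix needs (4**n_sites)**2 complex entries.
for n_sites in (4, 9, 16, 25):        # e.g. 2x2, 3x3, 4x4, 5x5 lattices
    dim = 4 ** n_sites
    dense_bytes = dim * dim * 16      # complex128 = 16 bytes per entry
    print(f"{n_sites:2d} sites: dim = {dim:.3e}, "
          f"dense H ~ {dense_bytes / 1e9:.3e} GB")
```

    Symmetries and sparse/stochastic methods push this wall back somewhat, which is exactly why the competing numerical methods in the links above exist — but the exponential scaling itself does not go away.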

  8. LOL … Dave, with respect to high-Tc superconductors, we’re not disagreeing about the *past* … we’re disagreeing about the future!
    There’s no shortage nowadays (and there never has been) of experimental datasets that are empirically hard to simulate … but (at present) there are *no* experimental datasets that are provably hard to simulate.
    That is why, to my mind, one of the truly great achievements of QIT is to point out that (in principle) it might be possible to attempt such experiments (in fact, Geordie, isn’t that an important aspect of D-Wave’s development program?)
    This has created that rare-and-valuable entity, a *new* class of physical experiments for mathematicians, scientists, and engineers to think about … and try to carry through.
    The proofs that the QIT folks contribute have especial value IMHO, because (as mathematician Henry George Forder said) “The virtue of a logical proof is not that it compels belief but that it suggests doubts. The proof tells us where to concentrate our doubts.”
    Similarly, the no-go theorems from QIT do much to tell quantum simulationists where to look for new algorithms … and these rigorously-grounded insights are immensely valuable.

  9. I might question the “rigorously grounded” part when applied to many QIT results. It seems to me that a lot of the theorem proving that happens occurs in experimentally non-interesting limits. For example, when Umesh said that computer science was the backbone of quantum computing, I nearly barfed up my expensive Swiss chocolate. Real (i.e., experimental) quantum computing is about condensed matter physics, not computer science. Any result that doesn’t explicitly include temperature, for example, I wouldn’t trust.

  10. > Certainly no quantum computer has been built which cannot be classically simulated (in the sense of simulating the computation, not in the sense of simulating the physical system.)
    I’m not so sure you’re correct about this, Dave. While I understand what you’re saying, I think that the inclusion of the environment adds a level of difficulty to the simulation exercise that might push our bigger chips out of the simulatable-in-principle range. I.e., simulating 128 quantum spins you can do with quantum Monte Carlo, but what do you do with 128 quantum spins interacting with even a simple thing like an Ohmic Markovian oscillator bath? In general you also have to include even harder things like 1/f noise, which is non-Markovian.
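    One way to make Geordie’s point concrete: a closed system of n spins needs 2^n amplitudes, while an open-system (master-equation) treatment propagates a 2^n × 2^n density matrix. Below is a minimal, illustrative sketch for a single qubit damped by a Markovian bath — the Hamiltonian and rate are invented for illustration; D-Wave’s actual noise model, and non-Markovian 1/f noise, are not modeled here.

```python
import numpy as np

# Illustrative sketch: one Euler-integrated Lindblad master equation for
# a single qubit coupled to a Markovian damping bath. Note the state is
# a density matrix rho (2**n x 2**n), not a state vector (2**n).

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator

H = 0.5 * sx                  # toy Hamiltonian (assumed; hbar = 1)
gamma = 0.1                   # assumed relaxation rate
L = np.sqrt(gamma) * sm       # single Lindblad (dissipation) operator

def lindblad_rhs(rho):
    """d(rho)/dt = -i[H, rho] + L rho L^+ - (1/2){L^+ L, rho}."""
    comm = -1j * (H @ rho - rho @ H)
    diss = (L @ rho @ L.conj().T
            - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

rho = np.array([[0, 0], [0, 1]], dtype=complex)  # start in |1><1|
dt = 0.01
for _ in range(1000):                            # crude Euler integration
    rho = rho + dt * lindblad_rhs(rho)

print("trace:", rho.trace().real)                # stays ~1
print("excited-state population:", rho[1, 1].real)
```

    Even this toy case doubles the exponent relative to a closed-system simulation; Geordie’s harder cases (non-Markovian baths) are worse still, since the bath’s memory must also be represented.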

  11. Now you guys have done it … raised issues that I’ll have to actually *think* about!
    Seriously, Dave, thank you very much for hosting this great blog … your Comm. ACM article was immensely thought-provoking too … and thanks to everyone who is commenting (Geordie especially).

  12. “Any result that doesn’t explicitly include temperature, for example, I wouldn’t trust.”
    Do you say similar things for classical computers? “I don’t trust quicksort because it doesn’t run at T not equal to zero.” Sorry that seems like a very narrow way to approach the world.
    Indeed I would say that you don’t really have a quantum COMPUTER unless you can forget about temperature. Before then what you’ve got is some pseudo-computer thingee that can be used to a limit, but then what? D-wave, the quantum pseudo-computer company?

  13. It just seems to me that boring/irrelevant work is boring/irrelevant work…independent of whether it has theorem proving or not. Physics has just as much worthless work done in irrelevant limits as CS does. But denying the value of proving the correctness of algorithms, or of understanding the limits and power of computing through a rigorous approach, seems wrong.
    Oh, and remind me not to let condensed matter physicists write computer code 🙂

  14. LOL … Dave, today you & I are flowing in similar directions … I’m studying Wikipedia’s (compact but opaquely abstract) article on “tautological one-forms” … with a view to purging our quantum simulation codes of shameful coordinate-dependence.
    Thus far, it’s not the simulation code that’s dumping core … but my cerebral cortex. 🙂

  15. > Oh, and remind me not to let condensed matter physicists write computer code 🙂
    REAL coders do everything by brute force. Efficiency is for the weak. Man’s game.

  16. “Do you say similar things for classical computers? “I don’t trust quicksort because it doesn’t run at T not equal to zero.” Sorry that seems like a very narrow way to approach the world.”
    Yes, some of us do! When analysing the computational complexity of, say, integer addition (people who design adders have to do it; theoretical CS types assume it is O(1) in time all the time 🙂 ), not much changes if one uses bits or trits, binary or ternary or whatever other reasonably small base. But the base has to be reasonably small for physical (ultimately Johnson-noise, i.e., temperature) reasons. After this is established, fast adder logic design can be handed over to computer scientists, and they will tell you that you need space of, say, N log_{base} N and time log_{base} N to add N-digit numbers, and any fixed base would influence only the pre-factors.
    But allowing T=0 for classical logic would allow infinitely large computational bases. “You want to add two N-bit integers? Well, in base 2^N it is O(1) operation!” or some such formally correct, but totally useless statement! 🙂
    Paul B.
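    Paul B.’s base argument can be made concrete with a toy ripple-carry adder that counts digit operations (a sketch; the function names are mine, and real adder circuits use carry-lookahead rather than this serial scheme):

```python
# Illustrative sketch of the base argument above: schoolbook (ripple-
# carry) addition of two N-digit numbers costs N digit operations in any
# fixed base -- but letting the base grow to 2**N hides all the work
# inside a single "digit" operation.

def to_digits(x, base):
    """Little-endian digits of non-negative integer x in the given base."""
    d = []
    while x:
        d.append(x % base)
        x //= base
    return d or [0]

def ripple_add(a_digits, b_digits, base):
    """Add two little-endian digit lists in the given base.
    Returns (sum_digits, number_of_digit_operations)."""
    n = max(len(a_digits), len(b_digits))
    out, carry, ops = [], 0, 0
    for i in range(n):
        ops += 1                                  # one digit-level add
        s = (a_digits[i] if i < len(a_digits) else 0) \
          + (b_digits[i] if i < len(b_digits) else 0) + carry
        out.append(s % base)
        carry = s // base
    if carry:
        out.append(carry)
    return out, ops

x, y = 2**64 - 1, 2**61 + 12345
for base in (2, 10, 2**16):
    _, ops = ripple_add(to_digits(x, base), to_digits(y, base), base)
    print(f"base {base}: {ops} digit operations")
# In base 2**65 both operands fit in one digit, so ops == 1 -- formally
# O(1), but only because a single "digit operation" now does ~65 bits
# of physical work.
```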

  17. Seriously, with regard to N^2 algorithms, the modern push toward simulationist S&E is driven (at the mathematical level) in large measure by a (boring?) algorithmic improvement N^3 → N^2 (where N is the number of finite elements) associated with the advent of iterative numerical solvers.
    The literature is somewhat sparse on this, because these algorithms (especially their preconditioners) are among the most closely-held proprietary secrets of companies like ANSYS. And if you don’t perceive these N^3 → N^2 improvements as having substantial economic value … well … you haven’t checked ANSYS’s 10-year stock history! The plain fact is, enterprises like Boeing’s 787 simply don’t fly without these N^3 → N^2 S&E simulation tools.
    That’s why a big part of the rationale for our QSE Group’s adoption of forms-and-flow frameworks is to adapt these N^3 → N^2 algorithmic advances to quantum S&E.
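    As a concrete (if greatly simplified) illustration of the iterative-solver point: a dense direct solve costs O(N^3), while each iteration of a Krylov method such as conjugate gradient costs one matrix-vector product — O(N) for a sparse finite-element matrix. The sketch below uses plain, unpreconditioned CG on a toy 1D stiffness matrix; the proprietary preconditioners mentioned above are of course not modeled.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by plain CG.
    Each iteration costs one matrix-vector product (O(N) if A is sparse)."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for k in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, k + 1
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, max_iter

# Toy SPD system: the tridiagonal 1D Laplacian stiffness matrix from a
# finite-element discretization (stored densely here for brevity).
N = 200
A = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
b = np.ones(N)
x, iters = conjugate_gradient(A, b)
print(f"converged in {iters} iterations, "
      f"residual {np.linalg.norm(b - A @ x):.2e}")
```

    With a good preconditioner the iteration count stays nearly flat as N grows, which is where the economically valuable (and closely held) engineering actually lives.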
