Scott Aaronson has a nice diatribe “Are Quantum States Exponentially Long Vectors?” which he’s posted on the arXiv as quant-ph/0507242. In this note he discusses his own personal counter to certain objections to quantum computation. It’s a very nice read.
My favorite part of the article is where Scott comes out as a full-on epistemologist:
To describe a state of n particles, we need to write down an exponentially long vector of exponentially small numbers, which themselves vary continuously. Moreover, the instant we measure a particle, we "collapse" the vector that describes its state, and not only that, but possibly the state of another particle on the opposite side of the universe. Quick, what theory have I just described?

The answer is classical probability theory. The moral is that, before we throw up our hands over the "extravagance" of the quantum worldview, we ought to ask: is it so much more extravagant than the classical probabilistic worldview?
To which I can only say “Amen, brother!” I think physicists, in particular, are VERY bad at understanding this argument.
Suppose we want to write down a theory which describes the state of classical bits. One can certainly pretend that the classical bits are always in some definite state, but now ask: how do we describe the state of our classical bits when we carry out an operation like flipping a fair coin and, conditional on the outcome, setting a bit to zero or one? We then need probabilities to describe our classical set of bits. If we have n classical bits, then the probability vector describing such a classical system is made up of two to the power n numbers (the probabilities). The number of numbers needed to describe a classical n-bit system is exponential in the number of bits! So should we be surprised that quantum computing requires states described by an exponential number of complex amplitudes? Doesn't seem as surprising now, does it?
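To make the counting concrete, here's a minimal sketch in Python (the bookkeeping conventions and variable names are mine, not anything from Scott's note): the probabilistic state of n bits is a vector of 2^n probabilities, and the "flip a coin, then conditionally set a bit" operation acts on it as a stochastic matrix.

```python
import numpy as np

# The probabilistic state of n classical bits is a vector of 2**n probabilities.
n = 3
state = np.zeros(2 ** n)
state[0] = 1.0  # start in the definite state 000

# "Flip a fair coin and, conditional on the outcome, set bit 0 to zero or one"
# acts on the description as a stochastic matrix (each column sums to 1).
op = np.zeros((2 ** n, 2 ** n))
for s in range(2 ** n):
    op[s & ~1, s] += 0.5   # heads: clear bit 0
    op[s | 1, s] += 0.5    # tails: set bit 0

state = op @ state
print(state)  # 0.5 on 000 and 0.5 on 001; already 2**n = 8 numbers to keep track of
```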
And there are a bunch of other similarities between probabilistic computation and quantum computation. If we measure such a classical system, we certainly get one of the bit strings, and our description immediately changes to a probability distribution with only one nonzero entry: the probability distribution collapses. Similarly, if we perform a single measurement, we don't learn the probabilities themselves, i.e. we don't learn these (real) numbers describing the classical state.
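Here's a similarly rough sketch of that measurement story (again my own toy code, not from the note): we sample one bit string, our description collapses to a single nonzero entry, and the one sample tells us nothing directly about the underlying probabilities.

```python
import numpy as np

rng = np.random.default_rng()

def measure(state):
    """Sample a bit string and collapse our *description* of the system."""
    outcome = rng.choice(len(state), p=state)   # one sample, not the probabilities
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0                    # distribution with one nonzero entry
    return outcome, collapsed

state = np.array([0.5, 0.5, 0.0, 0.0])          # two bits: 50% 00, 50% 01
outcome, state = measure(state)
print(format(int(outcome), "02b"), state)
```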
Another interesting analogy (which can only be pushed so far…and this is the real interesting part!) is with correlated bits. Suppose I flip a fair coin, and if the outcome is heads I put two bits which are both zero into two boxes. If the outcome is tails, I put two bits which are both one into two boxes. What is our description of the classical probabilistic state of these two boxes? We say 50% 00 and 50% 11. Now carry these boxes to the far ends of the universe. Open one of the boxes. Well, opening this box, I immediately know that whatever is in this box, the other bit, on the other side of the universe, must have the same value as my bit. Communication faster than light? No! Correlated bits? Yes! As a global observer, we can update our description of the system after a measurement by appropriately collapsing the probability distribution. Notice that until information is communicated about the measurement from one party to the other, the left-out party can't change his/her description of their system (or of the global system). Quantum entanglement is a "bit" like this…but the surprising thing is that it turns out to be different! How different? Well, this is the subject of Bell's theorem and, needless to say, the end result is one of the essential differences between classical probabilistic computation and quantum computation. But the fact that quantum theory is a consistent way to describe probability amplitudes is directly analogous to the manner in which classical probabilistic descriptions work!
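A toy version of the two-box story, with the details (dictionary representation, conditioning step) being my own choices: the joint description is 50% "00", 50% "11"; opening one box lets the global observer condition the joint distribution, while the far-away party's description stays put until the result is communicated.

```python
import numpy as np

rng = np.random.default_rng()

joint = {"00": 0.5, "11": 0.5}    # correlated bits: 50% 00, 50% 11

# Open the first box.
outcomes = list(joint)
opened = rng.choice(outcomes, p=[joint[o] for o in outcomes])
my_bit = opened[0]

# The global observer updates the description by conditioning on the first bit.
updated = {o: p for o, p in joint.items() if o[0] == my_bit}
total = sum(updated.values())
updated = {o: p / total for o, p in updated.items()}

# e.g. '1' {'11': 1.0} -- the far-away bit is now certain *in our description*,
# but nothing was communicated faster than light.
print(my_bit, updated)
```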
There are even more similarities between quantum computation and probabilistic classical computation. For example, there is a classical analogue of teleportation. It works out to be the one-time pad!
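For concreteness, here's a small sketch of that analogy as I read it (the names and setup are mine): a pre-shared random bit plays the role the shared entangled pair plays in teleportation, and XORing with it, i.e. the one-time pad, lets one transmitted bit carry the "secret" bit while looking random on the channel.

```python
import secrets

shared_key = secrets.randbelow(2)      # pre-shared random bit: the "resource"

def send(secret_bit):
    return secret_bit ^ shared_key     # what travels on the channel looks random

def receive(channel_bit):
    return channel_bit ^ shared_key    # XOR with the shared key recovers the bit

for b in (0, 1):
    assert receive(send(b)) == b
print("one-time pad round trip works")
```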
Notice that to get these interpretations of the similarities between classical probabilistic computation and quantum computation, we need to adopt a particular stance towards quantum theory. This is the epistemological view of quantum theory. In this view, roughly, the wave function of a quantum system is merely a description of a quantum system. It is not, say, like the classical position of a particle, which is a real number that we can really assign as a property of that classical system. I must say that I find myself very much in tune with this view of quantum theory. This does not mean, however, that this point of view totally solves all the problems people have with quantum theory. In particular, the problems of contextuality and the nonexistence of local hidden variable theories remain "troubling," and the question of "a description of WHAT?" is roughly the measurement problem. I certainly think that among quantum computing theorists, roughly this point of view is gaining more and more adherents. Which is good, because any mention of many worlds is destined to make me go crazy!
As a side note, when I started teaching the quantum computing course this summer, I attempted to teach quantum theory from the epistemological point of view. Unfortunately, the pace I set was too fast, and so I had to change tactics. But it certainly would be interesting to try to teach quantum theory from this perspective.
A final quote from Scott:
For almost a century, quantum mechanics was like a Kabbalistic secret that God revealed to Bohr, Bohr revealed to the physicists, and the physicists revealed (clearly) to no one. So long as the lasers and transistors worked, the rest of us shrugged at all the talk of complementarity and wave-particle duality, taking for granted that we'd never understand, or need to understand, what such things actually meant. But today, largely because of quantum computing, the Schrödinger's cat is out of the bag, and all of us are being forced to confront the exponential Beast that lurks inside our current picture of the world. And as you'd expect, not everyone is happy about that, just as the physicists themselves weren't all happy when they first had to confront it in the 1920's.
Which I really like, but I must take issue with. It's all the physicists' fault for not clearly communicating?! I don't think so. I think computer scientists were too busy with other important things, like, say, inventing the modern computer and building modern complexity theory, to even bother coming over and talking with us physicists about quantum theory. Just because you weren't paying attention doesn't mean you get to say that physicists weren't communicating clearly! Notice that it was basically three physicists, Benioff, Feynman, and Deutsch, who first really raised the question of what exactly a quantum computer would be. Of course it took computer scientists, like Bernstein, Vazirani, Simon, and Shor, to actually show us the path forward! But I think someone just as easily could have thought up quantum computation in 1950 as in 1980. The reason why it took so long to dream up quantum computers probably has more to do with the fact that no one, physicists or computer scientists, could really imagine doing the kinds of experiments which quantum computers represent. Of course, none of this really matters, but it's fun to yell and scream about this and pretend that it makes some sort of difference, when really it's just fun and funny.