Quantum Computer Dollars

How much would you pay for a quantum computer?
Of course, it depends on exactly what this quantum computer can do, doesn’t it? If I give you a two-qubit quantum computer, you may not want to pay me more than two bits (25 cents, people, not two binary digits). Which is not to belittle the few-qubit quantum computer experiments that have been done to date…these are among the most impressive works of experimental physics and engineering around. But I certainly wouldn’t pay much for the computational power these experiments demonstrate.
There are roughly two regimes where I think someone might actually want to buy a quantum computer. The first is a quantum computer with around 100 qubits which can perform some thousands of operations before the computer decoheres or errors accumulate. Why would I be interested in such a machine? Because I have no idea how to efficiently simulate some quantum systems of this size. Why do I go up to 100 qubits and not some smaller number like 20 or 30? Certainly simulating quantum systems of those sizes is difficult too. However, the systems we would really like to use a quantum computer to simulate, those with a large amount of entanglement, are probably two (or higher) dimensional systems, and a two-dimensional system of ten by ten qubits seems like a regime where I can at least begin to rid myself of some small finite-size effects.
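To put a number on that simulation difficulty, here’s a back-of-the-envelope sketch in Python (assuming the standard 16 bytes per amplitude for double-precision complex numbers):

```python
# Memory required to store the full state vector of an n-qubit system:
# 2^n complex amplitudes at 16 bytes each (double precision).

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (20, 30, 50, 100):
    print(f"{n:3d} qubits: 2^{n} amplitudes, about {state_vector_bytes(n):.2e} bytes")

# Output:
#  20 qubits: 2^20 amplitudes, about 1.68e+07 bytes  (a laptop handles this)
#  30 qubits: 2^30 amplitudes, about 1.72e+10 bytes  (a big classical machine)
#  50 qubits: 2^50 amplitudes, about 1.80e+16 bytes  (beyond any machine built)
# 100 qubits: 2^100 amplitudes, about 2.03e+31 bytes (hopeless, full stop)
```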
The next step, of course, is a full-scale quantum computer, one which is operating below the threshold for fault-tolerant quantum computation. What price should we assign such a device? Again it depends on the exact specs. But let’s just assume that this quantum computer has a few kilobytes of quantum memory. What will the clock speed of our quantum computer be? Well, it will certainly depend on the physical implementation. And there is the overhead of quantum error correction. So the clock speed may range anywhere from MHz to even PHz. How much would you pay for such a quantum computer?
For comparison, IBM’s Blue Gene, the world’s fastest supercomputer (that we know about) today, cost around one hundred million dollars.
Let the bidding begin!
The qBabbage: a 100-qubit quantum computer, with the ability to perform, say, 1000 operations before decoherence/noise ruins a quantum simulation. Start bids at ten thousand dollars.
The qMark I: A fault-tolerant quantum computer with 2 kilobytes of quantum memory and a clock speed of MHz. Start bids at half a million dollars.
The qWhirlwind: A fault-tolerant quantum computer with 2 kilobytes of quantum memory and a clock speed of THz. Start bids at one million dollars.

Poor Pluto

Looks like Pluto’s got some competition.

Two sets of astronomers have spotted a new planetoid in the outskirts of our Solar System. It is the brightest object in the region after Pluto, and it has its own small moon.

In recent years astronomers have spotted several Kuiper-belt planetoids, including ones named Quaoar and Varuna; the latest has been nicknamed Santa. Philosophical debates continue about how large such objects have to be before we call them ‘planets’ rather than simple lumps of rock.

Funny, I thought the earth was simply a lump of rock. Am I wrong? Is the earth really made of cheese or some other non-rock substance? And what’s with the philosopher bashing? Surely philosophers do more than just debate what one should label a planet! 😉

Been Around the World, and I, I, I…

One of the cool things about being a scientist these days is that the level of international collaboration is fairly high, and this means that one gets to make exciting trips to exciting lands. Last week I booked some travel for the end of the summer. Italy and Singapore. What a rough rough life!

OMG My Classical Probability Distribution Collapsed!

Scott Aaronson has a nice diatribe “Are Quantum States Exponentially Long Vectors?” which he’s posted on the arXiv as quant-ph/0507242. In this note he discusses his own personal counter to certain objections to quantum computation. It’s a very nice read.
My favorite part of the article is where Scott comes out as a full-on epistemologist:

To describe a state of n particles, we need to write down an exponentially long vector of exponentially small numbers, which themselves vary continuously. Moreover, the instant we measure a particle, we “collapse” the vector that describes its state—and not only that, but possibly the state of another particle on the opposite side of the universe. Quick, what theory have I just described?
The answer is classical probability theory. The moral is that, before we throw up our hands over the “extravagance” of the quantum worldview, we ought to ask: is it so much more extravagant than the classical probabilistic worldview?

To which I can only say “Amen, brother!” I think physicists, in particular, are VERY bad at understanding this argument.
Suppose we want to write down a theory which describes the state of classical bits. One can certainly pretend that the classical bits are always in some definite state, but now ask: how do we describe the state of our classical bits when we carry out an operation like flipping a fair coin and, conditional on the outcome, setting a bit to zero or one? We then need probabilities to describe our classical set of bits. If we have n classical bits, the probability vector describing such a classical system is made up of two to the power n numbers (the probabilities). The number of numbers needed to describe a classical n-bit system is exponential in the number of bits! So should we be surprised that quantum computing requires states described by an exponential number of complex amplitudes? Doesn’t seem as surprising now, does it?
And there are a bunch of other similarities between probabilistic computation and quantum computation. If we measure such a classical system, we certainly get one of the bit strings, and our description immediately changes to a probability distribution with only one nonzero entry: the probability distribution collapses. Similarly, from a single measurement we don’t learn the probabilities themselves, i.e. we don’t learn these (real) numbers describing the classical state.
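Here’s the bookkeeping from the last two paragraphs as a toy Python sketch (illustrative only): the state of n probabilistic bits is a vector of 2^n probabilities, a measurement samples a bit string, and our description collapses to a single nonzero entry:

```python
import random

# State of n probabilistic classical bits: a vector of 2^n probabilities,
# indexed by bit string. Start with n = 3 bits definitely in the state 000.
n = 3
state = [0.0] * (2 ** n)
state[0] = 1.0  # probability 1 on the string 000

def flip_fair_coin_into_bit(state, bit):
    """Flip a fair coin and set `bit` to the outcome (a stochastic update)."""
    new_state = [0.0] * len(state)
    for s, p in enumerate(state):
        new_state[s & ~(1 << bit)] += p / 2  # coin says: set bit to 0
        new_state[s | (1 << bit)] += p / 2   # coin says: set bit to 1
    return new_state

for bit in range(n):
    state = flip_fair_coin_into_bit(state, bit)
print("description before measurement:", state)  # 2^3 = 8 entries, each 1/8

# Measurement: sample one bit string, then collapse our description.
outcome = random.choices(range(2 ** n), weights=state)[0]
state = [1.0 if s == outcome else 0.0 for s in range(2 ** n)]
print(f"saw {outcome:03b}; collapsed description:", state)
# Note: the single sample told us a bit string, not the eight probabilities.
```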
Another interesting analogy (which can only be pushed so far…and this is the really interesting part!) is with correlated bits. Suppose I flip a fair coin and, if the outcome is heads, I put two bits which are both zero into two boxes. If the outcome is tails, I put two bits which are both one into two boxes. What is our description of the classical probabilistic state of these two boxes? We say 50% 00 and 50% 11. Now carry these boxes to the far ends of the universe. Open one of the boxes. Opening this box, I immediately know that whatever is in it, the other bit, on the other side of the universe, must have the same value as my bit. Communication faster than light? No! Correlated bits? Yes! As a global observer, we can update our description of the system after a measurement by appropriately collapsing the probability distribution. Notice that until information about the measurement is communicated from one party to the other, the left-out party can’t change his/her description of their system (or of the global system). Quantum entanglement is a “bit” like this…but the surprising thing is that it turns out to be different! How different? Well, this is the subject of Bell’s theorem and, needless to say, the end result is one of the essential differences between classical probabilistic computation and quantum computation. But the fact that quantum theory is a consistent way to describe probability amplitudes is directly analogous to the manner in which classical probabilistic descriptions work!
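And here’s the two-boxes story in a few lines of Python. The point to notice is that the “collapse” is an update of the opener’s description; nothing travels to the far box:

```python
import random

# One fair coin flip prepares two perfectly correlated bits in two boxes.
# Global description before anyone looks: 50% "00", 50% "11".
description = {"00": 0.5, "11": 0.5}

coin = random.choice([0, 1])
box_left, box_right = coin, coin  # carry them to opposite ends of the universe

# Open the left box: its opener collapses the *description*, not the far box.
seen = box_left
description = {f"{seen}{seen}": 1.0}
print(f"left box shows {seen}; opener's updated description: {description}")

# The right-box holder, who has heard nothing yet, still correctly uses
# the 50/50 description. Correlation, not faster-than-light signaling.
assert box_right == seen
```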
There are even more similarities between quantum computation and probabilistic classical computation. For example, there is a classical analog of teleportation. It works out to be the one-time pad!
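For the curious, a minimal sketch of that analogy: a pre-shared random key bit plays the role of the shared entangled pair, and the announced XOR plays the role of teleportation’s classical message (the analogy only stretches so far, of course):

```python
import random

# Shared resource: Alice and Bob each hold a copy of one random key bit
# (the classical stand-in for a shared entangled pair).
key = random.randint(0, 1)

# Alice "teleports" her secret bit by announcing only secret XOR key,
# which looks uniformly random to anyone without the key.
secret = 1
announcement = secret ^ key

# Bob recovers the secret using his copy of the key; the key is now used up.
recovered = announcement ^ key
assert recovered == secret
print("Bob recovered the bit:", recovered)
```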
Notice that to get these interpretations of the similarities between classical probabilistic computation and quantum computation, we need to adopt a particular stance towards quantum theory. This is the epistemological view of quantum theory. In this view, roughly, the wave function of a quantum system is merely a description of a quantum system. It is not, say, like the classical position of a particle, which is a real number we can really assign as a property of that classical system. I must say that I find myself very much in tune with this view of quantum theory. This does not mean, however, that this point of view totally solves all the problems people have with quantum theory. In particular, the problems of contextuality and the impossibility of local hidden variable theories remain “troubling,” and the question of “a description of WHAT?” is roughly the measurement problem. I certainly think that among quantum computing theorists, roughly this point of view is gaining more and more adherents. Which is good, because any mention of many worlds is destined to make me go crazy!
As a side note, when I started teaching the quantum computing course this summer, I attempted to teach quantum theory from the epistemological point of view. Unfortunately, the pace I set was too fast, and so I had to change tactics. But it certainly would be interesting to try to teach quantum theory from this perspective.
A final quote from Scott:

For almost a century, quantum mechanics was like a Kabbalistic secret that God revealed to Bohr, Bohr revealed to the physicists, and the physicists revealed (clearly) to no one. So long as the lasers and transistors worked, the rest of us shrugged at all the talk of complementarity and wave-particle duality, taking for granted that we’d never understand, or need to understand, what such things actually meant. But today—largely because of quantum computing—the Schrödinger’s cat is out of the bag, and all of us are being forced to confront the exponential Beast that lurks inside our current picture of the world. And as you’d expect, not everyone is happy about that, just as the physicists themselves weren’t all happy when they first had to confront it in the 1920’s.

Which I really like, but I must take issue with. It’s all the physicists’ fault for not clearly communicating?! I don’t think so. I think computer scientists were too busy with other important things, like, say, inventing the modern computer and building modern complexity theory, to even bother coming over and talking with us physicists about quantum theory. Just because you weren’t paying attention doesn’t mean you get to say that physicists weren’t communicating clearly! Notice that it was basically three physicists, Benioff, Feynman, and Deutsch, who first really raised the question of what exactly a quantum computer would be. Of course it took computer scientists, like Bernstein, Vazirani, Simon, and Shor, to actually show us the path forward! But I think someone could just as easily have thought up quantum computation in 1950 as in 1980. The reason it took so long to dream up quantum computers probably has more to do with the fact that no one, physicists or computer scientists, could really imagine doing the kinds of experiments which quantum computers represent. Of course, none of this really matters, but it’s fun to yell and scream about it and pretend that it makes some sort of difference, when really it’s just fun and funny.

Make It Planar

Steve Flammia points me to this cool game. Well at least it’s cool if you are the computer science type.

Beyond Moore's Law

If Moore’s law continues at its current pace, sometime between 2040 and 2050 the basic elements of a computer will be atomic sized. And even if Moore’s law slows, we hope that computers will eventually be made of components which are atomic sized. Either way, we really believe it might be possible to get to “the end of Moore’s curve.” A question I like to ask (especially of those employed in the computer industry) is “What will happen to your job when we hit the atomic size barrier for computer components?” (or, more interestingly, “Will you still have a job when Moore’s law ends?” Yes, I know: software is important, architecture is important, etc. I still think the end of Moore’s law will result in major changes in the computer industry.)
One point which comes up when we think about the end of Moore’s law is that, in some sense, the end we’re talking about is the end of a particular manner of creating fast computing devices. Mostly we are thinking about the end of silicon-based integrated circuits, and more broadly the end of transistor-based computers (i.e. we include both silicon-based circuits and molecular transistors, etc.). And the fundamental speed of these devices is rooted in physics. So what can physics tell us about what lies beyond Moore’s law?
Well, first of all, the length scales involved in the traditional version of Moore’s law are atomic length scales. The barrier we are hitting is basically the barrier set by the laws of atomic physics. But we know, of course, that there are smaller length scales possible. In particular, the next step down the ladder of sizes is to go to nuclear length scales. But we also need to say something about the speeds of operations. What are the limits on the speeds of gates at which an atomic-physics-based computer can operate? Interestingly, there is often a lot of confusion about this question. For example, suppose you are trying to drive an atomic transition (this is nothing like our transistors, but bear with me) with your good old tabletop laser. The speed at which you can drive this transition is related to the intensity of the laser beam. So it might seem, at first guess, that you can just keep cranking up the intensity of the laser beam to get faster and faster transitions. But eventually this will fail. Why? Because as you turn up the intensity of the laser beam, you also increase the probability that your system will make a transition to a state you don’t want it to be in. This may be another energy level, or you may blow the atom apart, or blow the atom out of whatever is keeping it in place, etc. Now, generally the intensity of the laser beam at which this becomes important is related to the energy spacing in the atomic system (if, say, you are trying not to excite a different level). Note that the actual energy spacing of the levels you are driving is NOT the relevant information, but most of the time this spacing is the same order of magnitude as the spacings to the levels you are trying to avoid. So this allows us to argue, roughly, that the minimum time for our transition is Planck’s constant divided by the energy spacing, or equivalently, that the maximum gate rate is the energy spacing divided by Planck’s constant.
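Stated as a formula, this loose energy–time argument (a heuristic, not a rigorous bound) is:

```latex
t_{\text{gate}} \gtrsim \frac{h}{\Delta E}
\qquad \Longleftrightarrow \qquad
f_{\max} \lesssim \frac{\Delta E}{h}
```

where \Delta E is the relevant level spacing and h is Planck’s constant.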
Now for atomic systems, the energy levels we are talking about might be, say, a few electron volts. So we might expect that the upper limit on the speed of our gate is something like 10^15 Hz. Note that today’s computers, which don’t operate by driving atomic transitions but in a different manner, run at clock speeds of 10^9 Hz (yeah, yeah, clock speed is no guarantee of instructions per second, but I’m a physicist, so order of magnitude is my middle name). Only six more orders of magnitude to go!
So what happens if we hit the end of atomic-sized computing devices? As I mentioned, the next step on the length-scale-slash-energy-scale ladder is nuclear systems. Here we find energy scales which are typically millions of electron volts. But I have absolutely no idea how to build a computer where internal nuclear states are used to compute. Which doesn’t mean that it’s impossible, of course (which reminds me of a great quote from the late great John Bell: “what is proved by impossibility proofs is lack of imagination.”) So there’s a good problem for a nuclear physicist with a few spare moments: think up a method for computing using nuclear transitions.
One can continue up the energy scale, of course. But now it gets even more far out to imagine how to get the device to compute. Is it possible to turn the Large Hadron Collider, currently being built at CERN, into a computer operating at 10^27 Hz (energies of tera-electron volts)? Now THAT would be a fast computer!
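A quick Python check of these orders of magnitude (nothing here beyond f = E/h; the specific energies are just representative round numbers):

```python
# Order-of-magnitude check: maximum gate rate f ~ E / h at each energy scale.
H_EV_S = 4.135667e-15  # Planck's constant in eV * seconds

for label, energy_ev in [("atomic, 1 eV", 1.0),
                         ("nuclear, 1 MeV", 1.0e6),
                         ("collider, 1 TeV", 1.0e12)]:
    print(f"{label:>15}: ~{energy_ev / H_EV_S:.1e} Hz")

# Output:
#    atomic, 1 eV: ~2.4e+14 Hz  (a few eV  -> the ~10^15 Hz quoted above)
#  nuclear, 1 MeV: ~2.4e+20 Hz  (a few MeV -> ~10^21 Hz)
# collider, 1 TeV: ~2.4e+26 Hz  (LHC-scale energies -> ~10^27 Hz)
```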

A Fork In the Road for Ion Traps

Big news for ion-trap quantum computers. It seems that Christopher Monroe’s ion trap group at the University of Michigan has succeeded in getting ions to shuttle around the corner of a T in their ion traps (their news item for this result is dated 6/11/05). This is, needless to say, a crucial step in building a “reasonable” architecture for quantum computing. This kind of thing makes me want to give up my theory license and jump into the lab!

Paper and Book Roundup

Some interesting papers.
First, a paper by Andrew Childs and Wim van Dam, “Quantum algorithm for a generalized hidden shift problem”, quant-ph/0507190, which gives a very nice new algorithm for, well, what it says: hidden shift problems! Interestingly, their new algorithm uses Lenstra’s classical integer programming algorithm to implement an entangled measurement on the quantum states they set up. I just started reading the paper this morning. Once I parse it, I may have more to post.
Another interesting paper, is “Rigorous location of phase transitions in hard optimization problems” which is, amazingly, a computer science article published in…Nature. If you read this paper and are a physicist, it will make you very proud:

Our results prove that the heuristic predictions of statistical physics in this context are essentially correct.

In other words…yeah, the physicists are actually really good at guessing which approximations to make! The paper is nice as well, rigorously establishing some properties of random instances of certain NP-complete problems.
Finally, I received in the mail yesterday “Probability Theory” by E.T. Jaynes. This book, in incomplete form, had been available on the web for many years. Following Jaynes’ death, G. Larry Bretthorst was able to collect some (but not all) of this material into “Probability Theory.” Unfortunately, Jaynes had intended to write two volumes, and it seems that the second volume was woefully incomplete and so will not be published.