More Dice

The full ‘t Hooft article (look, I put the apostrophe in the correct location!) is now posted at Physics World (not Physics Today, as I incorrectly listed in my first post), along with commentary by Edward Witten, Fay Dowker, and Paul Davies. Quick summary: Witten thinks that quantum cosmology is perplexing, Dowker worries about the emergence of classical physics, and Davies postulates that complexity is the key to understanding the emergence of classicality. Davies suggests that quantum mechanics will break down when the Hilbert space is of size 10^120, and that quantum computers will therefore fail at this size. His argument could equally be applied to probabilistic classical computers, so I suggest that if he is right, then classical computers using randomness cannot be any larger than about 400 bits.
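(For the curious, the arithmetic behind that 400-bit figure is just a logarithm. Here's a quick back-of-the-envelope sketch; the 10^120 cutoff is Davies', everything else is mine:)

```python
# A probabilistic classical computer on n bits samples from a distribution
# over 2**n states, so Davies' proposed cutoff of 10**120 on the size of the
# state space corresponds to n = log2(10**120) bits.
import math

cutoff_dimension = 10**120
n_bits = math.log2(cutoff_dimension)
print(f"{n_bits:.1f} bits")   # about 398.6, i.e. roughly 400 bits
```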

SQuInT 2006

My favoritest ( 😉 ) meeting, SQuInT, will be having its ninth conference in Albuquerque February 17-19. Whoop, another chance to visit the rattlesnake museum!

Dicey t' Hooft

Does God Play Dice? is not a treatise on religion and gambling, but is instead Gerard ‘t Hooft’s submission to Physics World concerning the reconciliation of quantum theory with general relativity. The most interesting part of this short note is not that ‘t Hooft comes out squarely on the side of hidden variable theories, but instead his description of an idea for how general relativity might arise in such a theory:

An even more daring proposition is that perhaps also general relativity does not appear in the formalism of the ultimate equations of nature. This journal does not allow me the space to explain in full detail what I have in mind. At the risk of not being understood at all, I’ll summarize my explanation. In making the transition from a deterministic theory to a statistical treatment — read: a quantum mechanical one —, one may find that the quantum description develops much more symmetries than the, deeper lying, deterministic one. If, classically, two different states evolve into the same final state, then quantum mechanically they will be indistinguishable. This induces symmetries not present in the initial laws. General coordinate covariance could be just such a symmetry.

That general coordinate covariance may not be fundamental but is instead a product of our inability to access the beables of a theory seems like quite an interesting idea. It would be interesting to think about whether this type of hidden variable theory, which is not totally general because it needs to recover general coordinate covariance, is indeed large enough to be consistent with quantum theory. That is, in the same way that Bell’s theorem rules out local hidden variable theories, is there a similar theorem ruling out ‘t Hooft’s proposal? I certainly have no inclination about the answer to this question in either direction.
Of further interest, ‘t Hooft offers the following as motivation for his perspective:

Nature provides us with one indication perhaps pointing in this direction: the unnatural, tiny value of the cosmological constant. It indicates that the universe has the propensity of staying flat. Why? No generally invariant theory can explain it. Yet, if an underlying, deterministic description naturally features some preferred flat coordinate frame, the puzzle will cease to perplex us.

Finally, for no reason but to make some portion of the readers of this blog happy and the other portion angry, here is ‘t Hooft on string theory:

I am definitely unhappy with the answers that string theory seems to suggest to us. String theory seems to be telling us to believe in “magic”: duality theorems, not properly understood, should allow us to predict amplitudes without proper local or causal structures. In physics, “magic” is synonymous to “deceit”; you rely on magic if you don’t understand what it is that is really going on. This should not be accepted.

I wish I understood what ‘t Hooft means in this critique by “proper local or causal structures.”

Shor's Legacy

Quantum computation was put on the map, so to speak, with Peter Shor’s discovery in 1994 that quantum computers could efficiently factor numbers. It is interesting to contemplate how this discovery will be viewed in the distant future. We are now over ten years out from Shor’s discovery. Perhaps this is too short a time to make a judgement, but let’s be arrogant and try to make such a historical judgement anyway. Along these lines, I’d claim that Shor’s algorithm is one of the most significant discoveries about algorithms in modern history. There is a certain group of algorithms, like Euclid’s algorithm for computing a greatest common divisor, which, in my mind, are among the most beautiful, eternal algorithms we know. (Algorithms from the code book, so to speak.) I would like to make the claim that Shor’s algorithm belongs in the same category as these algorithms. First of all, Shor’s algorithm solves a problem, factoring, which feels like one of those problems that lies very close to the base of mathematics. Certainly most of us learned about factoring numbers at a very young age, and indeed the view of a number as a product of its factors shapes in a deep way the manner in which we think about integers. Second, Shor’s algorithm’s use of period finding reveals in a deep way how quantum theory changes the way we can process information. That global properties of symmetric quantum states can be extracted from quantum systems, but not from their classical counterparts, is a deep observation and one which we are only now beginning to push in new directions.
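Since I’ve put Euclid’s algorithm and period finding side by side, here is a little toy sketch (my own illustration, not code from anywhere) of how the two hook together in Shor’s algorithm. The quantum subroutine is faked here by brute force, which is exactly the exponentially slow step a quantum computer replaces:

```python
def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

def find_period(a, N):
    """Stand-in for the quantum period-finding step: the smallest r > 0 with
    a^r = 1 (mod N). This classical brute force is exponentially slow."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(a, N):
    """Shor's classical post-processing: turn the period of a mod N into
    factors of N (assumes gcd(a, N) = 1)."""
    r = find_period(a, N)
    if r % 2 == 1:
        return None                      # odd period: try a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                      # unlucky choice of a: try again
    return gcd(y - 1, N), gcd(y + 1, N)  # at least one is a nontrivial factor

print(factor_from_period(2, 15))   # period of 2 mod 15 is 4, giving (3, 5)
```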
So, I’ll claim, Shor’s algorithm is a very profound discovery which will be thought of for centuries to come as one of humanity’s most beautiful discoveries. Interestingly, I think, this also puts a huge cloud over those of us working to discover new quantum algorithms. For instance, Sean Hallgren’s discovery that quantum computers can efficiently solve Pell’s equation is an awesome result, but viewed in light of factoring, well, it’s just hard to compete! Certainly one effect of this is that looking for new quantum algorithms has gained a reputation as being a very difficult field. And indeed, if you define difficult as coming up with something deeper than Shor’s algorithm, then working towards new quantum algorithms can seem a hopeless task. Indeed this is one of the reasons there has been so much focus on the hidden subgroup problem: an efficient algorithm for this problem would lead to efficient algorithms for the graph isomorphism problem and for certain shortest-vector-in-a-lattice problems, and these are big problems whose solution would represent big progress in algorithms. So we in quantum algorithms set our sights extremely high, because Shor set a bar which is at world-record pole-vault heights. Certainly this point of view argues for working on much smaller progress in quantum algorithms: understanding more basic, simple primitives even if they don’t lead to algorithms which are “better than Shor.” I, myself, struggle with thinking along these lines: our expectations are high and it is hard not to try to jump over the Shor barrier. But perhaps those with better self-control can move in these directions and maybe, if we are lucky, this progress will eventually show us new ways that quantum computers can outperform their nasty rivals, classical computers.

Edgy Deutsch

Via Geomblog, I see that David Deutsch has won the 2005 $100,000 Edge of Computation Science Prize (see the post here.) Here is the award citation:

Although the general idea of a quantum computer had been proposed earlier by Richard Feynman, in 1985 David Deutsch wrote the key paper which proposed the idea of a quantum computer and initiated the study of how to make one. Since then he has continued to be a pioneer and a leader in a rapidly growing field that is now called quantum information science.
Presently, small quantum computers are operating in laboratories around the world, and the race is on to find a scalable implementation that, if successful, will revolutionize the technologies of computation and communications. It is fair to say that no one deserves recognition for the growing success of this field more than Deutsch, for his ongoing work as well as for his founding paper. Among his key contributions in the last ten years are a paper with Ekert and Jozsa on quantum logic gates, and a proof of universality in quantum computation, with Barenco and Ekert (both in 1995).
One reason to nominate Deutsch for this prize is that he has always aimed to expand our understanding of the notion of computation in the context of the deepest questions in the foundations of mathematics and physics. Thus, his pioneering work in 1985 was motivated by interest in the Church-Turing thesis. Much of his recent work is motivated by his interest in the foundations of quantum mechanics, as we see from his 1997 book.

Congrats to David Deutsch!

Qubiting for Dollars

Rod Van Meter predicts that the first production quantum computer will cost forty bucks a qubit. He arrives at this number by assuming that the first application will be factoring a 1,024-bit number, which requires about five kilobits (kiloqubits?), and adding in a factor of fifty for quantum error correction. That gives a total of a quarter of a million qubits, and he figures such a machine could cost around ten million dollars, hence the figure of forty bucks a qubit. It’s interesting to reason about this by first setting the cost rather than by estimating the actual costs. What I like about this approach is that if we suppose that a similar quantum computer costs ten times as much, then we simply won’t be building such a computer. Of course, I’m not sure how much the NSA would pay for a quantum computer: it may indeed be in the hundred million dollar range if it factors 1,024-bit numbers.
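Spelling out the arithmetic (a sketch of my reading of Rod’s estimate; the ten-million-dollar price tag is the assumed target, and the other numbers are his ballpark figures):

```python
logical_qubits = 5_000             # ~5 kiloqubits to factor a 1,024-bit number
error_correction_overhead = 50     # physical qubits per logical qubit
machine_cost_dollars = 10_000_000  # assumed price of the first production machine

physical_qubits = logical_qubits * error_correction_overhead
print(physical_qubits)                          # 250,000 qubits
print(machine_cost_dollars / physical_qubits)   # $40 per qubit
```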
Of course, I don’t believe in quantum error correction… or rather, I don’t believe we will use the standard approach to quantum error correction to obtain a fault-tolerant quantum computer 😉 . Thus I’m betting the first quantum computer will cost around a dollar a qubit (although I certainly think some of these “qubits” may in fact be made up of hundreds to thousands to millions to 10^19 individual quantum systems).
It will be interesting to see how far off Rod and I are at date Q. I suspect Rod will be closer mostly because he’s actually worked in the real world.
Update: For comparison, the ENIAC cost about half a million dollars in 1945, which is about five million dollars in today’s money. The total number of vacuum tubes in that monster was around 20,000. That’s about 250 of today’s dollars per vacuum tube. And, of course, today I can buy a computer with more than a billion transistors for under five hundred bucks!

Quantum Audio Upgrade

Via a student from the class I taught this last summer, The Intelligent Chip:

The Intelligent Chip is a one inch square, bright orange plastic wafer that, when placed on top of a compact disc player for 2-3 seconds, upgrades the disc (CD, DVD or SACD) being played at the time. The sound of the upgraded disc has more detail and articulation, better dynamics and an absence of “digital harshness.” Voices are more human-sounding and less synthetic. The upgrade is permanent. Inside the white translucent Intelligent Chip case are two ultrathin, clear polycarbonate sheets, one on each side of the Chip. The manufacturer’s product brochure states, “The Intelligent Chip should be put back into the packing case after using, because the packing case can protect the quantum material of the Intelligent Chip, preventing them from leaking.”

Read the description of how it works. Who says quantum information science isn’t going to revolutionize the world… of audio devices. Why does my mouth taste like snake oil?

Computation….On the Edge

Via Three Quark’s Daily (Update: and also at Geomblog, which I somehow missed reading, but which has some great comments about quantum information science): The Edge (which you either love or hate) has announced a $100,000 prize in computation science. The list of nominations for the first prize was announced at the Festival della Scienza in Genoa on November 1st.
The list of those nominated is pretty interesting. From quantum computing land, those nominated are:

CHARLES H. BENNETT, for his many ongoing contributions to the physics of information, including reversible computation, quantum cryptography, quantum teleportation, and quantum communication theory.
DAVID DEUTSCH, for the enormous potential of quantum computing in studying the architecture of the brain.
SETH LLOYD, for turning quantum computers from dream into device.
PETER SHOR, for his discovery of revolutionary algorithms for quantum computation, which will hasten the day when this fundamentally new mode of computation becomes practicable.

That nomination for David Deutsch seems a bit strange to me: it sounds like it was written for Roger Penrose! I would have nominated David Deutsch for fundamental work developing the idea of a quantum computer, realizing that quantum computers could outperform classical computers, and for fundamental insights connecting physics to the foundations of computer science.

Superpositions of Worms Going In and Out

Over at Fact and Fiction there is an interesting post about an article in New Scientist called Attack of the Quantum Worms.
Having not read the article, I thought I’d be an idiot and comment on what one could possibly mean by the statement that quantum computers are more susceptible to malicious attacks. I’m an even bigger idiot because I know very little about computer software security. But life’s about saying silly things and then getting chewed out by all you smart people, you know?
The first thing one might mean is that quantum data cannot be backed up, due to the no-cloning theorem. OK, suppose I’m a hacker who wants to attack your quantum computer. Surely if I write some code which is executed by the computer and it damages the quantum state of the system, then, because I can’t back up the state of the quantum computer, I’m in big trouble. So it seems that one cannot prevent crashing a quantum computation by backing up the state of the system. But this isn’t quite right, is it? I mean, suppose we have a classical computation and we make two backups of the state of the computation at any time. Then we can use majority voting to test whether one of these states has been corrupted. But we’ve learned that we can do exactly the same thing for quantum computers: we can detect whether the quantum state has been corrupted by encoding it into a quantum error correcting code. Now we need to use more than three qubits per bit, but still, this doesn’t seem like such a big deal (from a theorist’s perspective 😉 ). I’d also like to say that I don’t think this is even the way classical computers protect themselves against malicious software.
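To make that classical baseline concrete, here is a toy sketch of the backup-and-majority-vote idea (purely my own illustration; the quantum version replaces the copies with an error correcting code, since cloning isn’t an option):

```python
from collections import Counter

def majority_vote(copies):
    """Recover each bit by taking the majority over the backup copies."""
    return [Counter(bits).most_common(1)[0][0] for bits in zip(*copies)]

state = [0, 1, 1, 0, 1]
backups = [state.copy(), state.copy(), state.copy()]
backups[1][2] ^= 1                 # a malicious bit flip corrupts one backup
print(majority_vote(backups))      # the vote recovers [0, 1, 1, 0, 1]
```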
So what about attacking the “quantum software”? Well, in the standard circuit model of quantum computation, the software is just a list of classical expressions. There is nothing quantum about the data describing a quantum computing program. So it seems that there can’t be any difference here between the quantum and classical worlds.
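Just to illustrate that point: a quantum program in the circuit model is nothing more than classical data, so it can be stored and checksummed like any other file (the gate list and format below are made up for illustration):

```python
import hashlib, json

# A "quantum program" is just a classical list of gate names and qubit indices.
program = [
    ("H", [0]),           # Hadamard on qubit 0
    ("CNOT", [0, 1]),     # controlled-NOT from qubit 0 to qubit 1
    ("MEASURE", [0, 1]),  # measure both qubits
]

# Being classical data, it can be verified with a perfectly ordinary checksum.
serialized = json.dumps(program).encode()
print(hashlib.sha256(serialized).hexdigest())
```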
Oh well, I really should get a copy of the New Scientist and figure out whether any of this rambling has anything to do with the article. (On a similar note, I don’t understand the second part of the post at Fact and Fiction. I just don’t see the relevance of copying in a fixed basis: this is something we almost always avoid in quantum computing because it destroys the coherence properties of the subsystem being copied.)

Your Symmetry Broke My Quantum Computer?

An article in Scientific American (of all places… I stopped reading Scientific American when they started a section on science versus pseudoscience. Sure, I agree with them, but I don’t want to read a science magazine to learn how science is different from pseudoscience; I already know that. Plus they stopped the amateur scientist section and the mathematical recreations section: really the two best reasons to read Scientific American in the good old days) describes a mechanism for decoherence due to spontaneous symmetry breaking.

Jeroen van den Brink and his colleagues at Leiden University in the Netherlands, however, suggest that even perfect isolation would not keep decoherence at bay. A process called spontaneous symmetry breaking will ruin the delicate state required for quantum computing. In the case of one proposed device based on superconducting quantum bits (qubits), they predict that this new source of decoherence would degrade the qubits after just a few seconds.

The paper in question, published in Physical Review Letters (and available as cond-mat/0408357), presents an interesting mechanism for decoherence. What is most interesting about this mechanism is the rate they obtain: [tex]$t_D = {N h \over k_B T}$[/tex], where N is the number of microscopic degrees of freedom, and h, k_B, and T should be recognizable to every physicist 😉
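Just to see where the quoted “few seconds” comes from, here is a quick plug-in of numbers into that rate (the values of N and T below are my own guesses for a superconducting qubit, not numbers taken from the paper):

```python
h   = 6.626e-34   # Planck's constant, J*s
k_B = 1.381e-23   # Boltzmann's constant, J/K

N = 1e9           # assumed number of microscopic degrees of freedom
T = 0.025         # assumed temperature: 25 mK

t_D = N * h / (k_B * T)
print(f"t_D ~ {t_D:.1f} s")   # about 2 seconds for these numbers
```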
What does this mean for quantum computers? Well, the above might indicate that this is some fundamental limit for quantum computing (and in particular for superconducting implementations of quantum computers, for which this result holds). But I don’t think this is true. I’ll let the article explain why:

Not everyone agrees that the constraint of a few seconds is a serious obstacle for superconducting qubits. John Martinis of the University of California at Santa Barbara says that one second “is fine for us experimentalists, since I think other physics will limit us well before this timescale.” According to theorist Steven M. Girvin of Yale University, “if we could get a coherence time of one second for a superconducting qubit, that would mean that decoherence would probably not be a limitation at all.” That is because quantum error correction can overcome decoherence once the coherence time is long enough, Girvin argues. By running on batches of qubits that each last for only a second, a quantum computer as a whole could continue working indefinitely.