An interesting question at the talk Sunday was: suppose you build a quantum computer, how do you really know that these crazy quantum laws are governing the computer? What I love about this question is that you can turn it around and ask the same question about classical computers! At heart we believe that if we put together a bunch of transistors to form a large computation, each of these components is behaving itself and nothing new is going on as the computer gets larger and larger. But for quantum computers there is a form of skepticism which says that once your computer is large enough, the description given by quantum theory will fail in some way. (See, for example, Scott Aaronson’s “Are Quantum States Exponentially Long Vectors?”, quant-ph/0507242, for a nice discussion along these lines.) But wait: why shouldn’t we ask whether classical evolution continues to hold for a computer as it gets bigger and bigger? Of course, we have never seen such a phenomenon, as far as I know. But if I were really crazy, I would claim that the computer just isn’t big enough yet. And if I were even crazier I would suggest that once a classical computer gets to a certain size its method of computation changes drastically and allows for a totally different notion of computational complexity. And if I were flipping mad, I would suggest that we do have an example of such a system, and this system is our brain. Luckily I’m only crazy, not flipping mad, but still this line of reasoning is fun to pursue.
Talking in L.A. Talking in L.A. Nobody Talks in L.A.
Yesterday I gave a talk on quantum computing in Los Angeles at math-club. What is math-club? From their website (“be there or be square”):
People who like books and stories have movies, libraries, and even book clubs to amuse them. Those interested in math and logic have amazon.com and the occasional brain teaser at the back of in-flight magazines.
MATH-CLUB was formed to engage a group of LA’s math-inclined people in analytical discussions under an easy going atmosphere.
The club got rolling under suitably informal circumstances — its founder, Roni Brunn, brought up the idea for a math club at a bar and was lucky enough to do so within ear shot of a friend, Matt Warburton, who liked the concept and thought of others who would as well. Soon after that, we had our first math club meeting.
Stewart Burns lectured on a Sunday afternoon in September, 2002, to a crowd that included David X. Cohen, Kirsten Roeters, April Pesa, Ken Keeler, Daisy Gardner, Sean Gunn, James Eagan, and, of course, Matt, using most horizontal surfaces at Roni’s apartment as seating.
Because many of those most actively involved with MATH-CLUB do have writing credits at shows like “The Simpsons” and “Futurama,” some assume the club is only the province of professional Hollywood writers. In fact, the club as a whole is a diverse group of people who punch the clock as professors, computer programmers, musicians, actors, designers, journalists, and documentarians.
Similarly, people come to meetings with a wide range of math backgrounds. Some of the members have advanced degrees in math and have been published; some are recreational users of math. People do ask questions and explore the topic as a group. We kick off meetings with a cocktail party.
Who could resist talking to such a club? Certainly not me. Especially when I received an email from the organizer, Roni Brunn, which had as its subject “math and simpsons.” Such subject lines stimulate approximately 90 percent of my brain. If you include the word “ski” as well, the extra 10 percent is activated and I go into epileptic seizures. Not a pretty sight.
I am especially fond of the math club analogy with book clubs (which were described to me by one of the people present last night as “mom’s drinking night.” Crap, now I’m going to get an email from my mother.) Why aren’t there more math clubs in the mold of book clubs, where small groups get together to hear about subjects which stretch their critical brains? I certainly think it’s a grand idea and am tempted to start a math club myself (but in Seattle would we replace the writers with programmers?)
When I was at Caltech I would often attend the talks in fields outside of physics/computer science. I called it “my television.” Certainly hearing a biology or geology talk and trying to discern what the heck they were talking about made me feel like a total moron. But whenever you hear a good talk outside of your field, you get a little bit of the feeling for what is going on, and this feels great. I personally wish that more graduate students would do this in order to help combat the graduate-student fatigue which comes from being far too narrowly focused and not remembering why it is that all of science is interesting.
Anyway, it was a fun talk, and hopefully I didn’t punish people too much. When I was in Italy recently, I noticed that when the students were not understanding the English I was using, I would try to fix this by reverting to idioms, which are simpler to me, of course, but totally incomprehensible to the Italian students. I noticed last night that I sometimes do a similar thing when I give a talk: when I’m talking about something and I think people are lost, I oscillate back to a language which is far simpler but whose content doesn’t really help convey my original meaning. What I need to learn to do is to ramp down to a medium level while keeping the content. Ah well, much to learn about this “giving talks” thing.
Is Geometry the Key?
Gilles Brassard has a very nice article in the Commentary section of Nature, “Is information the key?” From the abstract:
Quantum information science has brought us novel means of calculation and communication. But could its theorems hold the key to understanding the quantum world at its most profound level? Do the truly fundamental laws of nature concern — not waves and particles — but information?
The article is well worth reading.
The basic question asked in the paper is whether it is possible to derive quantum theory from basic rules about information plus a little bit more. Thus, for instance, one can ask whether it is possible to derive quantum theory by assuming things like no superluminal communication plus no bit commitment (two properties of quantum theory as we understand it today). To date, there have been some very nice attempts to move this task forward. In particular Brassard mentions the work of Bub, Clifton and Halvorson, which is very nice. However, my beef with all such derivations I’ve seen so far is that their assumptions are too strong. For example, in the Bub et al. work, they assume the theory must be described within a C*-algebraic framework. And this assumption just hides too much for me: it is basically an assumption of the linearity of the theory, and it doesn’t really shed light on why quantum theory should act in this manner. Linearity, for me, is basically the question “why amplitudes and not probabilities?” This, I find, is a central quantum mystery (well, not a mystery, but something I’d like to see a reason given for, in the same way that if I had been around in 1900 I would have wanted to see an explanation for the Lorentz transform, which is what Einstein, so beautifully, did). On the other hand, the fact that one can make these assumptions and derive quantum theory or quantum-like behavior is extremely suggestive, and I would be happy if lots of people started thinking about this question.
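To make the “why amplitudes and not probabilities?” point concrete, here is a toy illustration (mine, not Brassard’s): send a particle through the same 50/50 “beamsplitter” twice. If the splitter mixes probabilities, two passes leave you exactly as uncertain as one; if it transforms amplitudes (a Hadamard), the two passes interfere and the particle exits deterministically.

```python
import numpy as np

# A 50/50 "beamsplitter", described two ways.
stochastic = np.array([[0.5, 0.5],   # classical: a matrix of probabilities
                       [0.5, 0.5]])
hadamard = np.array([[1, 1],         # quantum: a matrix of amplitudes
                     [1, -1]]) / np.sqrt(2)

p_in = np.array([1.0, 0.0])          # the particle enters in path 0

# Two classical passes: still maximally uncertain.
print(stochastic @ stochastic @ p_in)   # [0.5 0.5]

# Two quantum passes: the amplitudes cancel on path 1.
amp_out = hadamard @ hadamard @ p_in
print(np.abs(amp_out) ** 2)             # [1. 0.]
```

No stochastic matrix can undo randomness this way; that the linear objects being transformed are amplitudes rather than probabilities is exactly the thing I’d like to see derived.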
Personally, I’ve already gone through a stage where I thought this might be the approach to understanding quantum theory and moved on to another stage (just as I have, throughout my life, loved every single interpretation. I even have a strong memory of sitting in a car on a vacation with my family, probably when I was in high school, where I distinctly remember understanding the many-worlds interpretation of quantum theory! On the other hand, I also have a vivid memory of waking up one night while I was an undergrad at Caltech and having an absolutely impossible-to-disprove reason for why there is evil in the world. Strange feelings, those. “The moment of clarity faded like charity does.”) In work I did with Ben Toner, we showed a protocol for simulating the correlations produced by projective measurements on a singlet using shared randomness and a single bit of communication. For a long time, Ben and I wondered whether we could derive this protocol via more basic assumptions. For example, is there a simple game for which the protocol with one bit of communication is the best strategy (this game also being best solved by bare quantum theory, unaided by communication)? Of course, one can always define a game such that these correlations and the protocol are the best, but that is cheating: we wanted a simple game to go along with our simple protocol. Alas, we could never find such a game, or a more basic set of principles from which to derive our protocol. As a funny note, we called the game we were searching for “Warcraft.” And when we were looking for a game whose optimal strategy would yield all of quantum theory we called it simply “The Game.” What is “The Game” at which quantum theory is the optimal strategy?
After working on “The Game” and related ideas, I’ve become less convinced that this is the proper approach to take. Why? Well, mostly due to the structure of the protocol we came up with for simulating the singlet quantum correlations. The important observation about this protocol, I now believe, is its geometric nature. If you look at the protocol (quant-ph/0304076), what is interesting about it, to me, is its beautiful geometry. I should mention that recently a reformulation of our protocol has appeared, quant-ph/0507120 by Degorre, Laplante, and Roland, which is very nice and also demonstrates how simple the geometry involved in the protocol is. So why do I focus on this geometric aspect? Well, because I think that the ultimate explanation for why quantum theory is the way it is must, in some way, provide answers to the question of hidden variables in quantum theory (prejudice number 1), and the only way I know to get around the usual obstacles to hidden variables is to muck with the topology of spacetime (prejudice number 2); thus an attempt to understand our protocol in terms of changes in the topology of spacetime is the proper route to take. However, I haven’t yet succeeded in recasting our protocol in these terms. Perhaps some crazy wunderkind out there can easily see how to do this! (Alternatively, I wouldn’t be surprised if my prejudice number two is somehow replaced with particularly strange reasoning about the nature of time. Here I am thinking somewhat of the transactional interpretation of quantum theory.)
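For the curious, the protocol is simple enough to check numerically. Below is a minimal Monte Carlo sketch, written from my memory of quant-ph/0304076 (so treat the exact sign conventions as my reconstruction, not the paper’s definitive statement): Alice and Bob share two uniformly random unit vectors, Alice sends Bob one bit built from her measurement axis, and the ±1 outputs reproduce the singlet correlation ⟨AB⟩ = −â·b̂.

```python
import numpy as np

rng = np.random.default_rng(2005)

def random_unit_vectors(n):
    """n points drawn uniformly on the unit sphere (normalized Gaussians)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def singlet_sim(a_hat, b_hat, n=500_000):
    """One-bit simulation of singlet correlations (my reconstruction
    of the protocol in quant-ph/0304076)."""
    lam1 = random_unit_vectors(n)   # shared randomness: two independent
    lam2 = random_unit_vectors(n)   # hidden unit vectors
    A = -np.sign(lam1 @ a_hat)                         # Alice's +/-1 output
    c = np.sign(lam1 @ a_hat) * np.sign(lam2 @ a_hat)  # the one communicated bit
    B = np.sign((lam1 + c[:, None] * lam2) @ b_hat)    # Bob's +/-1 output
    return np.mean(A * B)

a_hat = np.array([0.0, 0.0, 1.0])
b_hat = np.array([np.sin(1.0), 0.0, np.cos(1.0)])
print(singlet_sim(a_hat, b_hat), -a_hat @ b_hat)  # both near -cos(1) ~ -0.540
```

The geometry I keep going on about is visible even here: everything is built from signs of inner products with the two hidden vectors on the sphere.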
Anyway, understanding why quantum theory is the way it is is either one of the greatest mysteries of physics, or a dead end that will never lead to any experiments. I hope for the former, but can’t quite convince myself that it can’t be the latter.
A New Optimized Blog
Via Michael Nielsen’s blog, I’ve learned that Scott Aaronson has started a blog: Shtetl-Optimized. Since Scott is essentially crazy (that’s a compliment, peoples) I’m really looking forward to his posts. Plus he said he will talk about the Simpsons, and heck, who doesn’t like talking about the Simpsons?
Shor Broke Our Toys
Rod Van Meter has an interesting post up today on his blog titled Stop the Myth: QKD doesn’t fix what Shor broke. Along similar lines, Rod points to the paper quant-ph/0406147 by Kenneth G. Paterson, Fred Piper, and Ruediger Schack, “Why Quantum Cryptography?”
Here is my take on what Rod and quant-ph/0406147 are arguing (but I’d recommend reading what they say, because my expertise on quantum cryptography goes to zero in the limit of the number of people who know what they are talking about going to some number greater than four). Mostly I’ve just cobbled this together from what Rod said and my own personal misunderstandings. So you should read Rod’s post first, and then laugh at my silly rehashing of the main points.
To simplify life, let’s break secure communication into two steps: the first is authentication and the second is key exchange. In most applications we first need to authenticate our channel, and second we need to do key exchange over this authenticated channel. Now, Shor’s algorithm broke the most popular public key cryptosystems (those based on the difficulty of factoring and discrete log). These public key cryptosystems were often used both for authentication and for key exchange. Thus Shor broke both of these steps when they were built on those cryptosystems.
OK, now what does quantum key distribution (QKD) do? Given an authenticated channel, it allows you to create secret keys whose security is based on quantum theory being a correct description of nature. But how does one establish this authentication? Because Shor broke the most widely used public key cryptosystems, we need to establish it with some other technique. One way to do this is for the parties to share some small secret keys which they can then use to establish the authenticated channel. So the point Rod and others are making is that QKD doesn’t provide an authentication scheme whose security is based on quantum theory. Rod thinks that physicists haven’t known about this or realized its importance, which may be true, as I can’t read other physicists’ brains or listen to all they say, but I certainly think everyone working in quantum cryptography realizes this (and even some people who don’t, like me 😉 ) QKD is thus best viewed not as a total solution to unconditional security but as a way to take a small secret key and use it to create a huge shared secret key. The security of this will be based on (1) the security of the authentication scheme and (2) the laws of quantum theory. To put this in simple words, I would say that QKD fixes only part of what Shor broke.
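To make the division of labor concrete, here is a toy, purely classical mock-up of the sifting step of BB84 (my own illustration; real QKD of course involves actual qubits, channel noise, eavesdropper detection, and privacy amplification). The thing to notice is where the authenticated channel comes in: the basis comparison happens over public classical communication, and without authentication Alice and Bob could each be chatting with an impostor.

```python
import secrets

def bb84_sift(n=4096):
    """Toy BB84 sifting, no noise or eavesdropper: Alice sends random
    bits in random bases; Bob measures in random bases; they keep the
    positions where the bases happened to agree (about half)."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = Z, 1 = X
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]
    # matching basis -> Bob recovers Alice's bit; else a random result
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # over the AUTHENTICATED classical channel they compare bases, not bits
    key_a = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    assert key_a == key_b
    return key_a

print(len(bb84_sift()))   # roughly 2048 sifted key bits from 4096 "qubits"
```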
Another important point: in the above paragraph I’m talking about using QKD to distribute keys at a sufficiently high rate to use them as one-time pads for communication. Right now, many QKD systems do not operate at speeds fast enough to achieve this goal for reasonable communication rates. Certainly getting to these high speeds is one of the grand challenges in experimental QKD! Currently, I am told, many QKD systems sold use the keys they generate as part of another layer of cryptography (like 3DES or AES). This does provide security, but weaker security (one could conceivably break these other systems…though one would need to do this for every key the QKD refreshes: so as QKD rates go up and up, conceivably one can push this security higher and higher until eventually you will just want to use the key as a pad.)
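And to see why the rate matters so much, recall that a one-time pad eats key exactly as fast as it sends data; a minimal sketch:

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR cipher: information-theoretically secure, but it consumes one
    fresh key byte per message byte, which is why the QKD key rate must
    keep up with the communication rate to use its output as a pad."""
    if len(key) < len(data):
        raise ValueError("a one-time pad needs at least as much key as data")
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # stand-in for QKD-generated key material
ct = one_time_pad(msg, key)
assert one_time_pad(ct, key) == msg   # XORing twice recovers the message
```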
But, like I said, my understanding of this is murky, so go read Rod’s blog and check out quant-ph/0406147! Oh, and for all you lurking quantum cryptologists, I’d love to hear your take on all of this.
Got Security?
John Preskill points me to the Workshop on Classical and Quantum Information Security, to be held December 15-18, 2005, at Caltech:
This workshop will bring together researchers from a variety of backgrounds who work on different aspects of classical and quantum information security. Participants will strive to identify issues and problems of common interest that can be effectively addressed by pooling their expertise.
Looks like a very broad spectrum of researchers will be attending, which should make for a very interesting workshop. Overcoming communication barriers is often the first step in interesting new research directions, and I’m sure one hope of the workshop is to aid this difficult process.
Which reminds me of a documentary I saw a few years ago which tried to identify why certain schools were much better at producing (not buying 😉 ) Nobel prizes. What the documentary argued was that the common theme among the top Nobel-producing institutes was that they all had very loose definitions of what was considered “in a particular discipline.” Or, the way I like to think about it: the energy landscape on the space of different research programs had very shallow minima at the top Nobel-producing institutes, whereas at most other schools these wells were very deep…sometimes being even infinite square wells! This is, of course, an oversimplification, but I’m glad to see that Caltech (the highest Nobel prize count per member of the community that I know of) is hosting a workshop which so well embodies this spirit.
Angel Dust, Ozone, Wack, and Rocket Fuel
Course I’m sitting in on this term (will probably have to miss two lectures, drats!):
CSE 533: The PCP Theorem and Hardness of Approximation, Autumn 2005
Instructors:
Venkatesan Guruswami and Ryan O’Donnell
Meeting times: MW 10:30-11:50 at Loew Hall, Room 222
Course Description
The PCP Theorem states that there is a certain format for writing mathematical proofs, with the following property: a verifier can randomly spot-check only a *constant* number of bits of the proof and yet be convinced of its correctness or fallaciousness with extremely high probability. Further, proofs in this format need only be polynomially longer than they are in “classical” format, and can be produced in polynomial time from the classical proof.
The proof of this theorem in the early 1990’s was a great triumph of theoretical computer science, and it became even more important in the mid-1990’s when it was shown to be extremely powerful in proving NP-hardness results for *approximate* solutions to optimization problems.
Unfortunately, the known proof of the PCP theorem was extremely complex and also decentralized; even complete courses devoted to it rarely finished all of the proof details. However in April 2005, Irit Dinur completely changed the PCP landscape by giving a self-contained proof taking only a dozen or so pages. (See http://eccc.uni-trier.de/eccc-reports/2005/TR05-046/index.html)
In this course we will prove the PCP Theorem and also give some of its consequences for hardness of approximation. Topics will be a mix of older work from the 1990’s, as well as very recent results on the cutting edge of research, such as:
Interactive proofs
The Parallel Repetition Theorem
Hardness of approximation for linear equations, clique size, maximum cut, set cover, and lattice problems
Fourier analysis of boolean functions
The Unique Games conjecture
Hopefully I will have time to blog about some of this, as it is probably some of the most beautiful stuff in computer science. Even if it does share a name with a drug.
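To get a little taste of the “spot-check a proof at a constant number of positions” flavor before the course even starts, here is my own toy rendition of one classic ingredient (not from the course materials), the Blum-Luby-Rubinfeld linearity test: query a function at just three correlated points per trial, and after enough passing trials you become confident it is close to linear.

```python
import random

def blr_linearity_test(f, n_bits, trials=100):
    """Blum-Luby-Rubinfeld test: spot-check whether f: {0,1}^n -> {0,1}
    looks linear, i.e. f(x XOR y) == f(x) XOR f(y), using only three
    queries to f per trial."""
    for _ in range(trials):
        x = random.getrandbits(n_bits)
        y = random.getrandbits(n_bits)
        if f(x) ^ f(y) != f(x ^ y):
            return False   # caught an inconsistency
    return True            # f passed every spot-check

# A genuinely linear function (parity of the bits selected by mask 0b1011):
parity = lambda v: bin(v & 0b1011).count("1") % 2
print(blr_linearity_test(parity, 4))   # True
```

If f is far from every linear function, each trial fails with constant probability, so the test catches it with high probability; that soundness-from-few-queries trade-off is the whole game in PCP land.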
QIP 2006
The Ninth Workshop on Quantum Information Processing, i.e. QIP 2006, has its announcement out. Although, strangely, the three emails they sent me all crash pine! QIP will be in Paris this year. They’ve switched to making the process of selecting who will talk competitive (computer scientists, I say!) In addition to longer invited talks, there will be long (?!) 30-minute talks and short (I’ll say!) 10-minute talks from contributors. Unfortunately I won’t be able to make it this year due to other obligations. Drats, I’ve always wanted to spend time in Paris!
Subject: QIP’06 Call for Communications
This is the first announcement for QIP 2006, the Ninth Workshop on Quantum Information Processing, to be held in Paris, France, from Monday January 16th until Friday January 20th, 2006. You can find more information at the conference website at http://www.lri.fr/qip06 .

The program of QIP 2006 will be organized somewhat differently from the previous QIP workshops. Besides a reduced number of 45 minute invited talks there is a Call for Communications at http://www.lri.fr/qip06/call.html for long (30 minutes) and short (10 minutes) contributed talks. Contributed talks will constitute the major part of the scientific program.

We solicit your submissions to QIP 2006. Note that the deadline for submissions is November 3, 2005.

During the workshop there will also be a poster session, a business meeting, a conference dinner and additional social activities. There will be a nominal conference fee.
See you in Paris!
The organizers
Miklos Santha, Christoph Durr, Julia Kempe, Sophie Laplante and Frederic Magniez
(http://www.lri.fr/quantum/)
Post Quantum Cryptography
In the comments of Who Will Program the Quantum Computers?, Hal points to PQCrypto 2006, a conference on post-quantum cryptography. From their webpage:
PQCrypto 2006: International Workshop on Post-Quantum Cryptography
Will large quantum computers be built? If so, what will they do to the cryptographic landscape?
Anyone who can build a large quantum computer can break today’s most popular public-key cryptosystems: e.g., RSA, DSA, and ECDSA. But there are several other cryptosystems that are conjectured to resist quantum computers: e.g., the Diffie-Lamport-Merkle signature system, the NTRU encryption system, the McEliece encryption system, and the HFE signature system. Exactly which of these systems are secure? How efficient are they, in theory and in practice?
PQCrypto 2006, the International Workshop on Post-Quantum Cryptography, will look ahead to a possible future of quantum computers, and will begin preparing the cryptographic world for that future.
Prepare for us, world: we quantum computers are coming (well, maybe not so fast 😉 )
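Of the systems listed above, the hash-based signatures are the simplest to sketch. Here is a toy Lamport-style one-time signature (my own minimal rendition, not anything from PQCrypto): its security rests only on the preimage resistance of the hash function, which Shor’s algorithm does not touch, though each key pair may safely sign only a single message.

```python
import hashlib, secrets

H = lambda data: hashlib.sha256(data).digest()

def keygen(n_bits=256):
    # secret key: two random preimages per message-digest bit;
    # public key: their hashes
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(n_bits)]
    pk = [(H(x0), H(x1)) for x0, x1 in sk]
    return sk, pk

def digest_bits(msg, n_bits):
    d = H(msg)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(n_bits)]

def sign(msg, sk):
    # reveal one of the two preimages for each bit of the message digest
    return [sk[i][b] for i, b in enumerate(digest_bits(msg, len(sk)))]

def verify(msg, sig, pk):
    return all(H(sig[i]) == pk[i][b]
               for i, b in enumerate(digest_bits(msg, len(pk))))

sk, pk = keygen()
sig = sign(b"hello, post-quantum world", sk)
print(verify(b"hello, post-quantum world", sig, pk))  # True
print(verify(b"a forged message", sig, pk))           # False
```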
Optimistic Deutsch
David Deutsch thinks quantum computing is just around the corner. He has posted the following on his blog:
For a long time my standard answer to the question ‘how long will it be before the first universal quantum computer is built?’ was ‘several decades at least’. In fact, I have been saying this for almost exactly two decades … and now I am pleased to report that recent theoretical advances have caused me to conclude that we are within sight of that goal. It may well be achieved within the next decade.
The main discovery that has made the difference is cluster quantum computation, which is a marvellous new way of structuring quantum computations which makes them far easier to implement physically.
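The primitive behind cluster-state (measurement-based) computation, due to Raussendorf and Briegel, is easy to check numerically. Here is a toy numpy verification of the basic step as I understand it (my own sketch, with my own variable names): entangle an input qubit with a |+⟩ state via a controlled-Z, measure the first qubit in a rotated basis, and the second qubit is left holding a rotated, Hadamard-ed copy of the input, up to a known Pauli correction that depends on the measurement outcome.

```python
import numpy as np

# Single-qubit building blocks
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Rz = lambda t: np.diag([1.0, np.exp(1j * t)])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

def mbqc_step(psi, theta):
    """Entangle |psi> with |+> via CZ, then project qubit 1 onto the
    basis (|0> +/- e^{i*theta}|1>)/sqrt(2). Returns the normalized
    residual state of qubit 2 for outcomes m = 0 and m = 1."""
    state = CZ @ np.kron(psi, plus)           # the two-qubit cluster resource
    residuals = []
    for sign in (+1, -1):                     # outcome m = 0, then m = 1
        phi = np.array([1.0, sign * np.exp(1j * theta)]) / np.sqrt(2)
        r = phi.conj() @ state.reshape(2, 2)  # contract away qubit 1
        residuals.append(r / np.linalg.norm(r))
    return residuals

# The rule to verify: outcome m leaves qubit 2 in X^m H Rz(-theta)|psi>.
theta, psi = 0.7, np.array([0.6, 0.8j])
for m, got in enumerate(mbqc_step(psi, theta)):
    want = np.linalg.matrix_power(X, m) @ H @ Rz(-theta) @ psi
    want = want / np.linalg.norm(want)
    print(m, np.isclose(abs(np.vdot(want, got)), 1.0))  # True, True (up to phase)
```

The “marvellous” part is that the whole computation can proceed this way: prepare one big entangled cluster up front, then drive the logic purely with adaptive single-qubit measurements, which is the implementation win Deutsch is pointing at.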