Quantum Missile Command

My new favorite quantum hyperbole:

Handed to generals, a quantum computer might transform an ordinary nation into an instant superpower. Dozens of incoming missiles could be tracked at once.

Ketchup-ing

Things that happened while I was off the edge of the Interwebs:

  • Cormac McCarthy (my office neighbor while I was at the Santa Fe Institute) won the Pulitzer for fiction for his novel The Road. Cormac is also (amazingly!) giving an interview on Oprah, which is almost as amazing as Pynchon appearing on the Simpsons.
  • A new branch campus for the University of Washington is moving forward. This is a compromise over a previous attempt by the city of Everett to establish an independent polytechnic, which they hypothetically called the “Washington Institute of Technology” (WIT… you have to have a sense of humor to teach there?). This will bring to four the number of posts I have under the category of WIT.
  • Quantum interference in photosynthesis
  • Earth-like planet discovered only twenty light years away. (Probably) about five times as massive as the Earth. If we send off an expedition today, will we have evolved to survive the five-times-higher gravity by the time we arrive?
  • Papers scirate wants me to read while I was away: 0704.3432, 0704.2529, 0704.2575, 0704.2241, and 0704.3142.
  • A horrible title for a Nature blurb, “Quantum cryptography is hacked,” about an experiment performed at MIT (Phys. Rev. A 75, 042327). Notice how an inaccurate title leads to all sorts of bad follow-ups. That’s almost egregious enough to induce a Rage Against Doofosity!
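A side note on that “five times higher gravity” worry: surface gravity scales as M/R², not as mass alone, so a five-Earth-mass planet need not have five times the pull. A quick sketch, where the ~1.5 Earth-radii figure is my assumption for a rocky super-Earth of that mass (the actual radius was not measured):

```python
# Surface gravity scales as g ∝ M / R^2 (Newtonian).
# Assumed numbers: 5 Earth masses and ~1.5 Earth radii; the radius is a
# guess for a rocky super-Earth, not a measured value for this planet.

def surface_gravity_ratio(mass_earths, radius_earths):
    """Surface gravity relative to Earth: g/g_earth = M / R^2 in Earth units."""
    return mass_earths / radius_earths ** 2

ratio = surface_gravity_ratio(5.0, 1.5)
print(f"g / g_earth ≈ {ratio:.2f}")  # ≈ 2.22, not 5
```

So under that (assumed) radius, the expedition faces roughly double Earth gravity rather than quintuple, which seems a bit more survivable.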

The NP-complete Dog and Pony Show Comes to Seattle

Scott’s coming to town:

CSE Colloquium, 04/12/2007 3:30 pm, EE-105
Talk: The Limits of Quantum Computers
Speaker: Scott Aaronson (University of Waterloo)
Abstract: In the popular imagination, quantum computers would be almost magical devices, able to “solve impossible problems in an instant” by trying exponentially many solutions in parallel. In this talk, I’ll describe four results in quantum computing theory that directly challenge this view.
First, I’ll show that any quantum algorithm to decide whether a function f:[n]->[n] is one-to-one or two-to-one needs to query the function at least n^{1/5} times. This provides strong evidence that collision-resistant hash functions, and hence secure electronic commerce, would still be possible in a world with quantum computers.
Second, I’ll show that in the “black-box” or “oracle” model that we know how to analyze, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform “quantum advice states.”
Third, I’ll show that quantum computers need exponential time to find local optima — and surprisingly, that the ideas used to prove this result also yield new classical lower bounds for the same problem.
Finally, I’ll show how to do “pretty-good quantum state tomography” using a number of measurements that increases only linearly, not exponentially, with the number of qubits. This illustrates how one can sometimes turn the limitations of quantum computers on their head, and use them to develop new techniques for experimentalists.
No quantum computing background will be assumed.

Since no quantum computing background is assumed, I may even be able to follow this one!

PCP Slinging Journalists

So many times scientists are very harsh on science journalists. Thus it is refreshing to see this article in the MIT Technology Review by Jason Pontin on D-Wave’s recent Orion quantum computer. The article includes an interview with D-Wave’s Geordie Rose, which is refreshing mostly because Pontin pulls out the PCP theorem on Geordie. (Gunslinging computer scientists come equipped with the PCP theorem strapped to their belts. Only recently were the schematics for how to build this gun made accessible to mere mortals like us physicists.) The article seems rather fair to me: it airs Geordie’s views while asking the hard questions, includes criticisms from Scott and Umesh Vazirani, and gives a more moderate view from Seth Lloyd. Whether you are satisfied with the answers Geordie gives, of course, is a different matter! For example, I don’t think their current setup scales if they continue on the same path (the energy gap goes like one over the problem size; I might buy that thermalizing helps a bit, but I am extremely skeptical once the energy from the environment swamps the energy spectrum), but this is different from believing that it’s impossible to modify the path they are taking so as to actually scale their system.
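To make the scaling worry concrete: if the minimum gap closes like 1/n while the operating temperature stays fixed, then past some problem size the thermal energy k_B·T exceeds the gap. A toy sketch (not a model of the D-Wave device; the gap scale and temperature below are made-up numbers for illustration):

```python
# Toy illustration of a 1/n gap losing to a fixed thermal energy scale.
# DELTA_0 and T are invented numbers, chosen only to make the point.

K_B = 1.380649e-23      # Boltzmann constant, J/K
DELTA_0 = 1e-22         # assumed minimum-gap scale at n = 1, in joules
T = 0.05                # assumed fixed operating temperature, in kelvin

def gap(n):
    """Assumed 1/n closing of the minimum energy gap with problem size n."""
    return DELTA_0 / n

def thermal_energy(temp_kelvin):
    return K_B * temp_kelvin

# Find the smallest problem size where thermal energy overtakes the gap.
n = 1
while gap(n) > thermal_energy(T):
    n += 1
print(f"thermal energy exceeds the gap at n = {n}")
```

With these made-up numbers the crossover happens around n = 145; the point is only that a polynomially closing gap against a fixed temperature always loses eventually.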
Update: Here is the New York Times version of the article, where, sadly, the PCP theorem has been removed. Sad; I mean, all the latte-sipping, Volvo-driving lefties I know can speak elegantly in an instant about the PCP theorem and approximation algorithms.

QuJoke

Okay, so I’ve heard qubits and qutrits, but until recently hadn’t heard ququarts. More googling revealed qupents. Okay, then after a little more research I found an old 1999 paper which, while not containing a six level quantum system, did contain,

In particular, our protocol transfers entanglement from a pair of quantum correlated
light fields to a pair of trapped atoms in a process of quantum state exchange, or qusex.

Ahem.

Superposition of Funded and Not Funded

DARPA was an early, huge supporter of quantum computing, but the QuIST program ran its course and, so far, there hasn’t been any “equivalent” replacement program, as far as I’m aware (but what do I know!). Thus it was interesting to run across this article, which has an awesome title: “Schrodinger’s Contracts: US Explores Quantum Computing.” Somehow this makes me picture my brain in a superposition of solving and not solving research problems. Now if only I could do amplitude amplification on my brain.
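Brains aside, amplitude amplification itself is easy to sketch: repeatedly flip the sign of the marked amplitude (the oracle) and reflect all amplitudes about their mean (the diffusion step), and the marked item’s probability grows quadratically faster than classical guessing. A minimal textbook-style simulation in plain Python, not tied to any particular hardware or library:

```python
import math

# Grover-style amplitude amplification on a list of N real amplitudes
# with a single marked index. A sketch of the textbook iteration.

def amplitude_amplification(n_items, marked, iterations):
    # Start in the uniform superposition.
    amps = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked amplitude.
        amps[marked] = -amps[marked]
        # Diffusion: reflect every amplitude about the mean.
        mean = sum(amps) / n_items
        amps = [2.0 * mean - a for a in amps]
    return amps

N = 64
best = round((math.pi / 4) * math.sqrt(N))  # ~optimal iteration count
amps = amplitude_amplification(N, marked=7, iterations=best)
print(f"success probability after {best} iterations: {amps[7] ** 2:.3f}")
```

For N = 64, about six iterations push the success probability above 99%, versus the 1/64 you’d get by measuring the uniform superposition directly.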

Debug First?

After Scott confessed to still programming in BASIC, I had a good time recalling how I first learned to program. My first interaction with a computer and “programming” was through LOGO, that cute little language for programming a little turtle to draw graphics. Drawing cool and crazy pictures was fun! But I don’t really remember learning that as “programming” so much as learning geometry (I was in the second grade at the time), and I certainly recall sharing the computer (probably not very sharefully). But the real way I learned to program was not through LOGO but through debugging!
In the (g)olden days of my halcyon youth you could buy computer magazines with lines of BASIC printed in the back, to be entered into your computer in order to play a game. Now I was all about playing games, and when we obtained a TRS-80 Color Computer, my parents only bought one game cartridge, so in order to play more games we had to type them into the computer. A group of neighborhood kids would get together and do this. One person would read, another would type, and others would check to make sure what was being typed was correct. Even with these precautions, however, we would invariably make a mistake. So when we made a mistake we had to go looking through the program to find the error. At first, we’d just go through the whole program line by line. But as we got more sophisticated we began to understand that when the game died while doing X, the program was probably at Y, and so we began to understand what all that funny stuff we were typing in actually was. Eventually we got very good at debugging the code we typed into the computer. Indeed, in a very real sense, I learned to debug before I learned to code! (Years later I can still recall the astonishment of a high school teacher who watched me debug a program in fairly rapid time 🙂 )
So here is a question. Sure, we could teach students to program using your favorite language FAVLANG, but what if it were possible to teach debugging before we taught programming? What if the introduction to programming were structured around finding and fixing bugs? I mean, after reading Dreaming in Code, where the book ends with thousands of bugs left unfixed, I can only imagine that good debugging skills are important. Maybe even as important as coding skills? Just a thought. But even if this is not the way to go, how could you teach debugging before you taught programming? Now that’s fun to think about.
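One way to make a debug-first lesson concrete: hand students a short program that looks right, plus one failing check, and make their only job to find and fix the planted bug. A hypothetical exercise sketch (the function name and the bug are mine, invented for illustration):

```python
# A hypothetical "debug-first" exercise: students receive this code plus
# the failing check below, and must find the bug before they ever write
# code of their own. Planted bug: a classic off-by-one that skips the
# last element of the list.

def buggy_maximum(values):
    """Supposed to return the largest value -- but has a planted bug."""
    largest = values[0]
    for i in range(len(values) - 1):   # BUG: should be range(len(values))
        if values[i] > largest:
            largest = values[i]
    return largest

# The failing check the students start from:
print(buggy_maximum([3, 1, 4, 1, 5]))  # prints 4, but should print 5
```

The nice thing about this format is that it mirrors the magazine-typing experience above: the program mostly works, the symptom points at a region of the code, and the student reverse-engineers what the code means in order to fix it.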

"Dreaming in Code" by Scott Rosenberg

Many of you have probably read the classic The Soul of a New Machine by Tracy Kidder, which chronicles the building of a minicomputer at Data General in the late seventies. Earlier this week the torrent finally let up (no, it didn’t stop raining) and spring break descended, so I made a trip to the bookstore. There I found a book proclaiming on the back jacket to be the first true successor to The Soul of a New Machine: Dreaming in Code by the cofounder of Salon.com, Scott Rosenberg. Dreaming in Code chronicles the trials and tribulations of the creation of Chandler, an “interpersonal information manager that adapts to your changing needs,” which, if you visit the website linked above, you will discover is still at version 0.7. The book is an interesting, quick, if somewhat depressing read, I must say. Having never been involved in software development beyond my own hacking around with little personal programs, I found the picture it paints of how code is developed interesting. Indeed, I can see in my own methods of programming many of the traits described as impeding good software development in a team setting: the desire to do it all yourself, the desire to reinvent instead of reuse, the desire to overdesign, etc., etc. If nothing else, this book is probably a great read for anyone on the path toward becoming a software developer: not really a warning so much as a case study of the trials and tribulations of dreaming in code.

PRINT *, 'RIP, John Backus'

John Backus, who led the team that invented FORTRAN, has passed away. In a testament to the staying power of FORTRAN, when I was doing an undergraduate research project in astrophysics in 1995, most of the code I dealt with was written in FORTRAN.