What I Do

The life of a theorist (“Good Benito” by Alan Lightman, highly recommended):

He stands up from the boxes and looks out the window. To the east, in the distance, rises the steeple of a chapel, fragile and faint. The light changes. A cloud drifts over the sun. Then the sun is uncovered again, the little room fills up with light.
He lets down the blinds but keeps the slats open. Strips of light slide from the wall to the floor. He returns to his boxes, unpacks. A set of keys. A faded photograph of a young woman with auburn hair. Two old letters from John. These last things he puts carefully in a drawer. Most of the boxes are books. He stacks them against the wall, the muscles flexing in his arms. The room darkens as another cloud passes over the sun, lightens, darkens again.
Now he lies on the upholstered couch in the corner. He begins writing. He writes on a white pad of paper, wavy lines and strange signs, mathematical symbols. He closes his eyes for a while, begins writing again. Someone knocks on the door, but he doesn’t hear. He imagines corrugated surfaces, magnified again and again. He calculates and imagines, while the room glows and dims and the sun slides slowly across the floor.

The best days of a theorist are lonely periods of intense concentration mixed with a sort of daydreaming creativity. And it’s one of the reasons I find it nearly impossible to complain about what I do.

Breeding Books

In today’s age of online scientific publishing, it’s hard to remember the days when one would have to trudge to the library to do significant research. Even harder to think about are the days before the printing press, when books were copied by hand. In those old days, knowledge moved slowly, and such work must truly have been a labor of love (two good words: scrivener, “a professional or public copyist or writer,” and scriptorium, “a copying room for the scribes in a medieval monastery.” Scrivener, of course, is probably most famously known from the short story Bartleby the Scrivener by Herman Melville.) And when we think about it a bit, we realize that the resemblance between copying books and that other labor of love, having children, is more than just superficial. Indeed, in order for a book to be born, a previous copy must exist. Similarly, books are destroyed over a given number of years. A slightly more sophisticated model suggests that the birth rate for books will not be constant but will decrease as the number of books saturates “the market.” Thus we can model the growth and survival of books as a function of time.
In this month’s Science (307, p. 1305–1307, 2005), John Cisne proposes just such a model for the survival of books during the Middle Ages. What Cisne finds is that the age distributions of surviving books predicted by a simple population-dynamics model do indeed appear to be correct. So here are some cool numbers:

…manuscripts were about 15 to 30 times more likely to be copied as to be destroyed and had a half-life of four to nine centuries, and that population’s doubling time was on the order of two to three decades.
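As a rough sanity check, these numbers hang together. Here is a minimal back-of-the-envelope sketch (my own, not code from Cisne’s paper) treating manuscripts as a simple birth-death process; the rates are assumptions picked from the middle of the quoted ranges:

```python
import math

# Model manuscripts as a birth-death process: each manuscript is copied at
# rate b and destroyed at rate d (per manuscript per century).
half_life_centuries = 6.0  # assumed, from the quoted four-to-nine range
copy_to_destroy = 20.0     # assumed, from the quoted 15-to-30 range

d = math.log(2) / half_life_centuries  # destruction rate implied by the half-life
b = copy_to_destroy * d                # copying rate

# Far from saturation the population grows like exp((b - d) * t),
# so the doubling time is ln(2) / (b - d).
doubling_time_years = 100 * math.log(2) / (b - d)
print(f"doubling time: about {doubling_time_years:.0f} years")
```

This prints a doubling time of about 32 years, right in the quoted two-to-three-decade range.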

I wonder if one of the reasons why science didn’t advance as much during the Middle Ages was that this long doubling time (two to three decades) meant it was very unlikely that the fruits of your hard work producing a book would be widely disseminated during your lifetime. Imagine working on science knowing that the implications of your work wouldn’t be revealed until long after your death. The invention of the printing press sure was a marvelous event, wasn’t it?

Theory Like Software

One interesting issue in quantum information science is the lack of hiring of theory people by physics departments in the United States over the last few years (by my count, four hires in the last six years.) I think one of the main reasons for this is that physics departments are still skeptical of the field of quantum computing. For experimentalists the hiring situation has been better (although, of course, no piece of cake by any measure!) The reason for this, it seems to me, is that experimentalists in quantum information science are doing interesting experiments which push the boundaries of experimental physics. It is easier to justify hiring an experimentalist because there is some belief that they will be able to adjust if quantum information begins to fizzle out (I don’t believe it will, but apparently a large number of physicists do.) In essence, physics departments feel that experimentalists are a better hedge than theorists.
But let’s take a look at this issue from the perspective of the next few years. As quantum computers grow from four- and five-qubit machines to ten- and then hundred-qubit machines, the experimentalists will be in some ways less tied to fundamental physics and more tied to the engineering and technology of the devices they will be building. And there is another important factor: not all implementations of quantum computers are going to pay off. Thus while hiring an experimentalist who is working on the implementation which really takes off can be a jackpot for a department, hiring one who is working on an implementation which fizzles can leave the department in exactly the position it is supposedly avoiding by not hiring theorists.
Now look at this from the perspective of hiring a theorist. Quantum information theorists are much more immune to which implementation really takes off. Sure, some theorists are more tied to a particular implementation, but the main bulk of theory is done in a way which is independent of any quantum computing platform. Thus quantum information theorists, like those who work on the computer science of software, are in many ways a more robust hire in the long run than an experimentalist.
Of course, this argument is only a small part of the big picture (what happens if the field is a fad? what if you do believe you can pick out the best implementation? what if you only care about hiring someone who will have an impact in the next five to ten years?), but it’s certainly an argument which I wish more physics departments would listen to.

Second Hand News

You know blogs are taking off when you’re in the airport listening to the audio of one of the 24-hour news stations and you hear them report not on the news directly, as one might expect from their name, but on “what the blogs are reporting.” Ugh.

From Outer Space

Back. Washington was…sunny. Bet that was the last adjective you expected. Here is a copy of the talk I gave in the computer science and engineering department at the University of Washington (warning: it’s 17 megs).

"Physics"

For a while I’ve joked that at the rate hard drive capacities are increasing, it will soon be possible that instead of having an MP3 player with all your favorite songs on it, you will simply have a device with “music.” All of music.
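For fun, here is the back-of-the-envelope arithmetic behind the joke (every number here is my own rough guess, nothing more):

```python
# How big is "music," all of it? All figures are rough guesses.
songs = 10_000_000   # assumed size of the entire recorded-music catalog
mb_per_song = 5      # a typical ~4-minute MP3 at ~160 kbps

total_tb = songs * mb_per_song / 1_000_000
print(f"all of music: about {total_tb:.0f} TB")  # about 50 TB
```

With hard drive capacities roughly doubling every couple of years, fifty terabytes is only a handful of doublings away from a single consumer drive.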
Now I learn, via Michael Nielsen’s blog, that Joanna Karczmarek is starting a project to put the entire arXiv.org into BitTorrent files. Currently she is offering the hep-th section from all of 2004 via a torrent. So, I guess, coming soon to a laptop near you: “physics.” I wonder if this will be the impetus for me to get a new laptop with a monstrous hard drive.

Best Abstract Ever

Ken Brown sends me a nomination for the “best abstract ever”:

Malmberg and O’Neil, PRL 39, 1333 (1977), “Pure Electron Plasma, Liquid, and Crystal”
Abstract: We speculate on the possibility of liquefying and crystallizing a magnetically confined pure electron plasma.

Fifty Dimensional Computers

Sometimes there are conflicts that run deep inside of me because of my original training as a physicist. One thing I have never understood in classical computer science is why the locality and three-dimensionality of our world don’t come into play in the theory of computational complexity. I mean sure, I can understand how you would like to divorce the study of the complexity of algorithms from the underlying medium. Sure, I can also understand that those who study architectures spend copious amounts of time dealing with exactly how resources scale due to the constraints of locality and dimensionality. But isn’t it true that a true theory of the complexity of information processing should, at its most fundamental level, make reference to the dimensionality and connectivity of the space in which the computer is built? Perhaps the complexity of cellular automata goes some way towards this goal, but somehow it doesn’t feel like it goes all of the way. Most importantly, the usual conditions of uniformity of the cellular automata seem to me to be overly restrictive for a theory of computational complexity which doesn’t ignore issues of dimensionality and locality.
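To make that intuition concrete, here is a toy calculation (entirely my own illustration, not anyone’s theorem): pack N memory cells into a d-dimensional grid where signals travel at a bounded speed, and the farthest cell sits roughly N^(1/d) steps away, so worst-case memory access costs on the order of N^(1/d) time rather than the single unit a random-access machine charges.

```python
# Toy illustration: in a d-dimensional cubic layout of n cells with
# bounded-speed signaling, the farthest cell is about n**(1/d) steps away.
def worst_case_access_steps(n_cells: int, dim: int) -> float:
    return n_cells ** (1.0 / dim)

for d in (1, 2, 3, 50):
    print(f"d = {d:2d}: about {worst_case_access_steps(10**9, d):,.0f} steps")
# d =  1: about 1,000,000,000 steps
# d =  2: about 31,623 steps
# d =  3: about 1,000 steps
# d = 50: about 2 steps -- in fifty dimensions, everything is next door.
```

A complexity theory that took locality seriously would have to charge that N^(1/d) factor somewhere.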
Another interesting spinoff of this line of reasoning is to think about how computation changes when the topology of spacetime is not trivial and even when the topology of spacetime is itself something which can change with time. What is the power of a computer which can change the very notion of the topology of the underlying circuit connectivity?

SQuInT

For those of you who don’t know, SQuInT stands for “Southwest Quantum Information and Technology,” a conference that has been held in the southwestern United States for seven years (and even longer if you count some “pre”-SQuInTs.) SQuInT is becoming more and more distinctive in that it is one of the rare conferences which tries to bring together the different parts of physics involved in trying to build a quantum computer. They even have a few talks by silly theorists like me. My talk wasn’t as good as I had hoped. Thirty minutes is pretty darn stringent.
The highlight of the conference, besides the night spent watching the old couples work the dance floor at the “exclusive” resort in Tucson where the conference was held, was hearing about the work of Robert Schoelkopf from Yale on combining cavity quantum electrodynamics with superconducting qubits. Traditional cavity QED is done with cavities coupling to neutral atoms (in either a microwave or optical regime.) Some of the earliest quantum computing implementations were performed in cavity QED by Jeff Kimble’s lab at Caltech. What Robert talked about was using a cavity to couple to a hybrid superconducting qubit, and he showed some really nice results demonstrating vacuum Rabi oscillations from the coupling of the cavity to the qubit. An amazing aspect of this system is that the effective dipole moment of the superconducting qubit is about ten thousand times stronger than that of a neutral atom. Why is this important for quantum computing? Probably because one of the most difficult tasks for many solid-state quantum computing systems is reading out the state of the qubit with high reliability and without destroying the system, and Robert’s scheme shows a reasonable chance of performing such a task. For those of you who wish to bet with me on what the final quantum computer will look like, the SQuInT conference, and Robert’s results in particular, have made me recalculate my odds. Please send me an email if you would like to place a bet ; )
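For the curious, here is a minimal sketch of what vacuum Rabi oscillations look like in the resonant Jaynes-Cummings model (my own toy numbers, not Schoelkopf’s actual parameters). In the single-excitation subspace the excitation simply sloshes between qubit and cavity at the coupling rate g, and a larger effective dipole moment means a larger g:

```python
import numpy as np

# Resonant Jaynes-Cummings model in the single-excitation subspace
# {|e,0>, |g,1>}: starting from |e,0>, the probability that the
# excitation is still on the qubit is cos^2(g*t).
g = 2 * np.pi * 10e6  # qubit-cavity coupling in rad/s (an assumed round number)

t = np.linspace(0.0, 100e-9, 5)  # sample the first 100 ns
p_qubit = np.cos(g * t) ** 2
for ti, pi in zip(t, p_qubit):
    print(f"t = {ti * 1e9:5.1f} ns   P(qubit excited) = {pi:.2f}")
```

One full swap out to the cavity and back takes pi/g, 50 nanoseconds here; the stronger the coupling, the faster the swap and the easier it is to beat decoherence.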

Been Around the World and I, I, I…

Yep, I’m still alive. Back from the SQuInT conference in Tucson, Arizona. Back in Santa Fe for exactly seven hours. Ugh. Hopefully I will have a chance to post during a layover on my way to Washington tomorrow.