CEPI Seminar 3/9/05 – Ben Schumacher

Ben Schumacher, quantum informationista extraordinaire and inventor of the word “qubit,” will be giving the next Complexity, Entropy, and the Physics of Information Distinguished Lecture at the Santa Fe Institute this Wednesday, 3/9/05. Reception at 4:15, talk starts at 5:00:

Information Engines and the Second Law
Ben Schumacher
Department of Physics, Kenyon College
Maxwell’s demon, which extracts work from a thermodynamic system by acquiring information about it, has for more than a century been a favorite thought-experiment in the foundations of statistical physics. The demon has variously been viewed as a threat, an exception, an exemplar, and a means for extending the Second Law. I will describe a new formulation of thermodynamics in which such “information engines” play the central role, giving new insights about entropy, information erasure, the meaning of temperature, and the connection between fluctuation and dissipation.

Optimality Feels Good

New paper! New paper! Let’s all do the new paper dance. Posted to the arxiv today: quant-ph 0503047. This paper is a revision of an earlier paper; it now shows that the protocol from the earlier paper is in fact optimal.
Optimal classical-communication-assisted local model of n-qubit Greenberger-Horne-Zeilinger correlations
Authors: Tracey E. Tessier, Carlton M. Caves, Ivan H. Deutsch, Dave Bacon, Bryan Eastin
Comments: This submission supersedes quant-ph/0407133. It is a substantially revised version of the previous document and now includes an optimality proof of our model

We present a model, motivated by the criterion of reality put forward by Einstein, Podolsky, and Rosen and supplemented by classical communication, which correctly reproduces the quantum-mechanical predictions for measurements of all products of Pauli operators on an n-qubit GHZ state (or "cat state"). The n-2 bits employed by our model are shown to be optimal for the allowed set of measurements, demonstrating that the required communication overhead scales linearly with n. We formulate a connection between the generation of the local values utilized by our model and the stabilizer formalism, which leads us to conjecture that a generalization of this method will shed light on the content of the Gottesman-Knill theorem.
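
Just to illustrate what the model has to reproduce (this is not the paper’s protocol, only the target quantum predictions), here is a quick numpy sketch of the Pauli-product correlations for the n = 3 GHZ state:

    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

    # 3-qubit GHZ ("cat") state: (|000> + |111>)/sqrt(2)
    ghz = np.zeros(8, dtype=complex)
    ghz[0] = ghz[7] = 1 / np.sqrt(2)

    def product(ops):
        m = ops[0]
        for o in ops[1:]:
            m = np.kron(m, o)
        return m

    for label, ops in [("XXX", [X, X, X]), ("XYY", [X, Y, Y]),
                       ("YXY", [Y, X, Y]), ("YYX", [Y, Y, X])]:
        val = np.real(ghz.conj() @ product(ops) @ ghz)
        print(f"<{label}> = {val:+.0f}")

    # Output: +1, -1, -1, -1. No fixed assignment of local values can
    # satisfy all four at once, which is why some classical
    # communication is needed to mimic these correlations.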

Day 9

It’s been far too long since my last ski trip. Yesterday I went skiing at the Santa Fe Ski Basin. It’s amazing what a small amount of new snow can do to make the conditions enjoyable. Steve Flammia came up with the following interesting question: “How many chairs (what percentage) do you pass on your way from the top to the bottom of the chair lift?” [correction: As Joe points out, I somehow managed to totally mangle this question. Of course the question should be how many chairs you pass on the way from the bottom to the top of the chair lift. God, Mondays are rough.] Make a first intuitive guess without thinking about the problem. I’d love to hear what people’s first guesses are in the comments.
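
Once you’ve made your guess, here’s a minimal simulation you can check it against. It’s my own toy model, assuming an idealized lift: a closed loop of length 2 whose up side runs from 0 to 1, with equally spaced chairs all moving at unit speed, and you riding one of them.

    N = 100        # total chairs on the loop
    dt = 1e-4      # time step

    def height(u):
        # height above the base for a chair at loop coordinate u
        u %= 2.0
        return u if u <= 1.0 else 2.0 - u

    passed = 0
    for i in range(1, N):              # every chair except yours
        q = 2.0 * i / N                # chair's starting loop coordinate
        prev = None
        t = 0.0
        while t <= 1.0:                # your ride: bottom (0) to top (1)
            u = (q + t) % 2.0
            if u >= 1.0:               # chair is on the down side
                d = height(u) - t      # its height gap to you
                if prev is not None and prev > 0 >= d:
                    passed += 1        # gap crossed zero: it passed you
                    break
                prev = d
            else:
                prev = None
            t += dt

    print(f"passed {passed} of {N - 1} other chairs")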

Stop Blaming the Bugs

I just got through watching the movie “The Butterfly Effect.” (Decent movie; I could probably form a religion from its basic plot.) The name of the movie comes from a statement you sometimes hear from those who work in chaos theory:

The flapping of a single butterfly’s wing today produces a tiny change in the state of the atmosphere. Over a period of time, what the atmosphere actually does diverges from what it would have done. So, in a month’s time, a tornado that would have devastated the Indonesian coast doesn’t happen. Or maybe one that wasn’t going to happen, does. (Ian Stewart, “Does God Play Dice? The Mathematics of Chaos”)

And I have to say that I’ve never understood what I’m supposed to take from this statement. Sure, many systems are chaotic, and we can have large differences in behavior from seemingly small changes in the initial conditions. But I sometimes get the feeling that a causal claim is being made in this statement: if it weren’t for the butterfly, the people in the path of the tornado would be fine, i.e. the butterfly caused the tornado. But this clearly isn’t true. There are plenty of other effects which are also causally necessary for the tornado. Do I get to blame the butterfly if an even smaller change in the wavefunction of a single proton somewhere in the upper atmosphere changes the initial conditions by an even smaller amount than the butterfly, and this in turn changes the entire outcome of whether there is a tornado? In fact, I would argue that we can only blame the butterfly if other changes in the initial conditions of comparable size do not change the outcome of whether there is a tornado or not. Chaos may be ubiquitous, but I wish we’d all stop blaming the butterfly.
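
To be clear, the sensitivity itself is real and easy to demonstrate. Here’s a toy sketch using the logistic map at r = 4, a standard chaotic system standing in for the atmosphere, where two trajectories that start a “butterfly” apart end up completely different:

    # two trajectories of the chaotic logistic map x -> r*x*(1-x)
    r = 4.0
    x, y = 0.3, 0.3 + 1e-12      # initially a "butterfly" apart
    for n in range(1, 61):
        x, y = r * x * (1 - x), r * y * (1 - y)
        if n % 10 == 0:
            print(f"step {n}: |x - y| = {abs(x - y):.3e}")

    # The gap grows roughly exponentially: within about 40 steps the
    # two trajectories bear no resemblance to each other.

But that only shows the outcome depends sensitively on everything in the initial conditions, which is exactly the point: sensitivity is not causation.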

Probability of Greatness

An amusing anecdote from cond-mat 0305150 by Simkin and Roychowdhury:

During the “Manhattan project” (the making of nuclear bomb[sic]), Fermi asked Gen. Groves, the head of the project, what is the definition of a “great” general. Groves replied that any general who had won five battles in a row might safely be called great. Fermi then asked how many generals were great. Groves said about three out of every hundred. Fermi conjectured that considering that opposing forces for most battles are roughly equal in strength, the chance of winning one battle is 1/2 and the chance of winning five battles in a row is (1/2)^5=1/32. “So you are right, General, about three out of every hundred. Mathematical probability, not genius.”
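
Fermi’s arithmetic is easy to check numerically; here’s a quick sketch of my own, treating each battle as a fair coin flip:

    import random

    random.seed(1)
    generals = 100_000
    great = sum(all(random.random() < 0.5 for _ in range(5))
                for _ in range(generals))
    print(great / generals)    # ~0.031, i.e. about 3 in 100
    print(0.5 ** 5)            # exact: 1/32 = 0.03125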

What I Do

The life of a theorist (“Good Benito” by Alan Lightman, highly recommended):

He stands up from the boxes and looks out the window. To the east, in the distance, rises the steeple of a chapel, fragile and faint. The light changes. A cloud drifts over the sun. Then the sun is uncovered again, the little room fills up with light.
He lets down the blinds but keeps the slats open. Strips of light slide from the wall to the floor. He returns to his boxes, unpacks. A set of keys. A faded photograph of a young woman with auburn hair. Two old letters from John. These last things he puts carefully in a drawer. Most of the boxes are books. He stacks them against the wall, the muscles flexing in his arms. The room darkens as another cloud passes over the sun, lightens, darkens again.
Now he lies on the upholstered couch in the corner. He begins writing. He writes on a white pad of paper, wavy lines and strange signs, mathematical symbols. He closes his eyes for a while, begins writing again. Someone knocks on the door, but he doesn’t hear. He imagines corrugated surfaces, magnified again and again. He calculates and imagines, while the room glows and dims and the sun slides slowly across the floor.

The best days of a theorist are lonely periods of intense concentration mixed with a sort of daydreaming creativity. And it’s one of the reasons I find it nearly impossible to complain about what I do.

Breeding Books

In today’s age of online scientific publishing, it’s hard to remember the days when one would have to trudge to the library to do significant research. Even harder to think about are the days before the printing press, when books were copied by hand. In those old days, knowledge moved slowly, and such work truly must have been a labor of love (two good words, scrivener: “a professional or public copyist or writer” and scriptorium: “a copying room for the scribes in a medieval monastery.” Scrivener, of course, is probably most famously known from the short story Bartleby the Scrivener by Herman Melville.) And when we think about it a bit, we realize that the resemblance between copying books and the other labor of love, having children, is more than just superficial. Indeed, in order for a book to be born, a previous copy must exist. Similarly, books are destroyed over a given number of years. A slightly more sophisticated model suggests that the birth rate for books will not be constant but will decrease as the number of books saturates “the market.” Thus we can model the growth and survival rates of books as a function of time.
In this month’s Science (307, p. 1305-1307, 2005), John Cisne proposes just such a model for the survival of books during the Middle Ages. What Cisne finds is that the age distributions of books surviving today do indeed match the predictions of a simple population-dynamics model. So here are some cool numbers:

…manuscripts were about 15 to 30 times more likely to be copied as to be destroyed and had a half-life of four to nine centuries, and that population’s doubling time was on the order of two to three decades.
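
Out of curiosity, here is a toy version of such a birth-death model, with rates chosen by me to match the quoted figures rather than taken from Cisne’s fit (and ignoring the saturation effect mentioned above):

    import math

    half_life = 650.0                      # years until half the copies are destroyed
    delta = math.log(2) / half_life        # per-year destruction rate
    doubling = 25.0                        # years for the population to double
    beta = math.log(2) / doubling + delta  # per-year copying rate

    print(f"copy rate / destruction rate = {beta / delta:.0f}")  # ~27x

    # expected copies descended from one exemplar: N(t) = e^((beta - delta) t)
    for t in (25, 100, 300):
        print(f"after {t:3d} years: ~{math.exp((beta - delta) * t):.0f} copies")

The numbers hang together nicely: a copy rate roughly 27 times the destruction rate sits right in the quoted 15-to-30 range.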

I wonder if one of the reasons why science didn’t advance as much during the Middle Ages was that this long doubling time (two to three decades) meant that it was very unlikely that the fruits of your hard work producing a book would be widely disseminated during your lifetime. Think if you could work on science, but the implications of your work wouldn’t be revealed until long after your death. The invention of the printing press sure was a marvelous event, wasn’t it?

Theory Like Software

One interesting issue in quantum information science is the lack of hiring of theory people by physics departments in the United States over the last few years (by my count, four hires in the last six years). I think one of the main reasons for this is that physics departments are still skeptical of the field of quantum computing. For experimentalists, the hiring situation has been better (although, of course, no piece of cake by any measure!). The reason for this, it seems to me, is that experimentalists in quantum information science are doing interesting experiments which push the boundaries of experimental physics. It is easier to justify hiring an experimentalist because there is some belief that they will be able to adjust if quantum information begins to fizzle out (I don’t believe it will, but apparently a large number of physicists do). In essence, physics departments feel that experimentalists are a better hedge than theorists.
But let’s take a look at this issue from the perspective of the next few years. As quantum computers grow from four- and five-qubit devices to ten- or hundred-qubit devices, experimentalists will be in some ways less tied to fundamental physics and more tied to the engineering and technology of the machines they are building. And there is another important factor: not all implementations of quantum computers are going to pay off. Thus, while hiring an experimentalist who works on the implementation which really takes off can be a jackpot for a department, hiring one who works on the implementation which fizzles can leave the department in exactly the position it was supposedly avoiding by not hiring theorists.
Now look at this from the perspective of hiring a theorist. Quantum information theorists are much more immune to which implementation really takes off. Sure, some theorists are more tied to a particular implementation, but, on the other hand, the bulk of theory is done in a way which is independent of any quantum computing platform. Thus quantum information theorists, like those involved in the computer science of software, are in many ways a more robust hire in the long run than an experimentalist.
Of course, this argument is only a small part of the big picture (What happens if the field is a fad? What if you believe you can pick out the best implementation? What if you only care about hiring someone who will have an impact in the next five to ten years?), but it’s certainly an argument which I wish more physics departments would listen to.

Second Hand News

You know blogs are taking off when you’re in the airport listening to the audio of one of the 24-hour news stations and you hear them report not on the news directly, as one might expect from their name, but on “what the blogs are reporting.” Ugh.

From Outer Space

Back. Washington was…sunny. Bet that was the last adjective you expected. Here is a copy of the talk I gave in the computer science and engineering department at the University of Washington (warning: it’s 17 megs).