The Most Interesting Quantum Foundations Result You've Never Heard Of

When I was an undergraduate at Caltech, visiting Harvard for the summer, I stumbled upon Volume 21 of the International Journal of Theoretical Physics (1982). What was special about this volume was that it was dedicated to papers on the subject of physics and computation (I believe it was associated with the PhysComp conference). Now, for as long as I can remember I have been interested in physics and computers. Indeed one of the first programs I ever wrote was a gravitational simulator on my TRS-80 Color Computer (my first attempt failed because I didn’t know trig and ended up doing a small angle approximation for resolving vectors…strange orbits those.) Anyway, back to Volume 21. It contained a huge number of papers that I found totally and amazingly interesting. Among my favorites was the plenary talk by Feynman in which he discusses “Simulating Physics with Computers.” This paper is a classic in which Feynman discusses the question of whether quantum systems can be probabilistically simulated by a classical computer. The talk includes a discussion of Bell’s theorem without a single reference to John Bell, Feynman chastising a questioner for misusing the word “quantizing”, and finally Feynman stating one of my favorite Feynman quotes:

The program that Fredkin is always pushing, about trying to find a computer simulation of physics, seems to me an excellent program to follow out. He and I have had wonderful, intense, and interminable arguments, and my argument is always that the real use of it would be with quantum mechanics, and therefore full attention and acceptance of the quantum mechanical phenomena-the challenge of explaining quantum mechanical phenomena-has to be put into the argument, and therefore these phenomena have to be understood very well in analyzing the situation. And I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.

(That, by the way, is how he ends the paper. Talk about a way to finish!)
Another paper I found fascinating in the volume was one by Marvin Minsky in which he points out how cellular automata can give rise to relativistic and quantum-like effects. In retrospect I don’t see as much that is amazing about this paper, but it was refreshing to see things we regard as purely physics emerging from simple computational models.
But the final paper, and the one which to this day I go back and read, was “The Computer and the Universe” by John Wheeler. Of course, this being a Wheeler paper, it was something of a poetic romp…but remember I was a literature major so I just ate that style up! But the most important thing I found in that paper was a description by Wheeler of the doctoral thesis of Wootters. Wootters’s result is, I think, one of the most interesting results in the foundations of quantum theory that you’ve never heard of (unless you’ve read one of the versions of the computation-and-physics treatise that Wheeler has published). Further, it is one of those results which is hard to find in the literature.
So what is this result that I speak of? What Wootters considers is the following setup. Suppose a transmitter has a machine with a dial which can point in any direction in a plane, i.e. the transmitter has a dial which is set to an angle between zero and three hundred and sixty degrees. Now this transmitter flips a switch and off goes something…we don’t know what…but at the other end of the line, a receiver sits with another device. This device does one simple thing: it receives that something from the transmitter and then either does or does not turn on a red light. In other words this other device is a measurement apparatus which has two measurement outcomes. Now of course those of you who know quantum theory will recognize the experiment I just described, but you be quiet, I don’t want to hear from you…I want to think, more generally, about this experimental setup.
So we have a transmitter with an angle and a receiver with a yes/no measurement. Now yes/no measurements are interesting. Suppose that you do one hundred yes/no measurements and find that yes occurs thirty times. You will conclude that the probability of the yes outcome is roughly thirty percent. But probabilities are finicky, and with only one hundred yes/no measurements you can’t be certain that the probability is thirty percent. It could very well be twenty-five percent or thirty-two percent. Now take this observation and apply it to the setup we have above. Suppose that the transmitter really, really wants to tell you the angle he has his device set at, but the receiver is only getting yes/no measurements. What probability of yes/no outcomes should this setup have, such that the receiver gains the most information about the angle being sent? Or, expressed another way, suppose that a large but finite number of different angles can be set on the transmitter. If for each of these angles we get to choose a probability distribution, then each probability distribution will have some ability to be distinguished from the others. Suppose that we want to maximize the number of distinguishable settings for the transmitter. What probability distribution should occur (i.e. what probability of yes should there be as a function of the angle)?
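(A quick aside to make the finite-statistics point concrete. Below is a minimal Python sketch; the 95% normal-approximation interval is my own arbitrary choice of error bar, not anything from Wootters. It shows how wide a range of underlying probabilities is consistent with thirty yes outcomes in one hundred trials.)

    import math

    n_trials, n_yes = 100, 30
    p_hat = n_yes / n_trials                              # point estimate of the yes probability
    std_err = math.sqrt(p_hat * (1 - p_hat) / n_trials)   # binomial standard error

    # Rough 95% interval (normal approximation): the data do not pin the probability down.
    low, high = p_hat - 1.96 * std_err, p_hat + 1.96 * std_err
    print(f"estimate {p_hat:.2f}, plausible range [{low:.2f}, {high:.2f}]")
    # prints roughly [0.21, 0.39], so twenty-five or thirty-two percent are both consistent
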
And the answer? The answer is [tex]$p_{yes}(\theta)=\cos^2\left({n\theta \over 2}\right)$[/tex] where [tex]$n$[/tex] is an integer and [tex]$\theta$[/tex] is the angle. Look familiar? Yep, that’s the quantum mechanical expression for a setup where you send a spin [tex]$n/2$[/tex] particle with its amplitude in a plane, and then you measure along one of the directions in that plane. In other words quantum theory, in this formulation, is set up so that the yes/no distribution maximizes the amount of information we learn about the angle [tex]$\theta$[/tex]! Amazing!
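One way to get a feel for why the cosine-squared rule is special (this is only my own toy illustration, not Wootters’s actual derivation, for which you should read the thesis): for a yes/no measurement the local distinguishability of nearby dial settings is measured by the Fisher information, and under the rule above it comes out the same at every setting, so no part of the dial is wasted. A small Python sketch, assuming p(theta) = cos^2(n*theta/2):

    import numpy as np

    n = 1                                         # the integer in p(theta) = cos^2(n*theta/2)
    theta = np.linspace(0.1, np.pi - 0.1, 50)     # dial settings, away from the endpoints

    p = np.cos(n * theta / 2) ** 2
    dp = -(n / 2) * np.sin(n * theta)             # dp/dtheta

    # Fisher information of one yes/no trial about theta: constant (= n^2) under this rule,
    # so every part of the dial is equally distinguishable from its neighbors.
    fisher = dp ** 2 / (p * (1 - p))
    print(np.allclose(fisher, n ** 2))            # True

    # Statistical (Bhattacharyya) angle between two settings a and b, chosen so that
    # n*angle/2 stays in the first quadrant: it is just n/2 times the dial separation.
    a, b = 0.3, 1.1
    pa, pb = np.cos(n * a / 2) ** 2, np.cos(n * b / 2) ** 2
    d = np.arccos(np.sqrt(pa * pb) + np.sqrt((1 - pa) * (1 - pb)))
    print(np.isclose(d, n * abs(a - b) / 2))      # True

So under this rule equal steps of the dial are equally easy to tell apart with yes/no data, which is the flavor of the optimality that Wootters makes precise.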
You can find all of this in Wootters’s 1980 thesis. I first laid my hands on a copy because Patrick Hayden had one; I subsequently lost it, but now have it on loan for the next two weeks! Now, of course, there are caveats about all of this and you should read Wootters’s thesis, which I do highly recommend. But what an interesting result. Why haven’t we all heard of it?

25 Replies to “The Most Interesting Quantum Foundations Result You've Never Heard Of”

  1. Wootters was one of Wheeler’s students, wasn’t he?
    His result has a distinctly MaxEnt/Bayesian flavor, although the precise connection, if indeed it can be made, is not evident to me.

  2. Trivial correction: His surname is “Wootters,” with the “s” being an integral part of the name. The possessive would be either “Wootters’s” or “Wootters'” (I’d go with the former), but not “Wootter’s”…
    (He was also my StatMech prof as an undergrad…)

  3. Dave said, “But what an interesting result. Why haven’t we all heard of it?”
    Perhaps because not all of us take quantum foundations as seriously as we should.
    Chris W. said, “His result has a distinctly MaxEnt/Bayesian flavor, although the precise connection, if indeed it can be made, is not evident to me.”
    Philip Goyal’s recent work on an axiomatic derivation of quantum theory can be viewed as an attempt to beef up results of the sort that Wootters found in his Ph.D. thesis. It’s not exactly the same argument, but the connection to MaxEnt is made explicit. Unfortunately, I can’t find links to any of his papers (he needs to learn to use the arXiv).

  4. The specific connection that I had in mind may be found in gr-qc/0508108, and maybe elsewhere, although I can’t recall where at the moment. A teaser:

    Then we introduce the main assumption: there is an intrinsic fuzziness to space which is revealed by an irreducible uncertainty in the location of the test particles. Thus, to each point in space we associate a probability distribution. The overall state of space – the macrostate – is defined by the product of the distributions associated to the individual points. The geometry of space is the geometry of all the distances between test particles and this geometry is of statistical origin [10]. Identical particles that are close together are easy to confuse, those that are far apart are easy to distinguish. The distance between two neighboring particles is the distinguishability distance between the corresponding probability distributions which is given by the Fisher-Rao metric [11]. A remarkable feature of this choice of distance is its uniqueness: the Fisher-Rao metric is the only metric that takes account of the fact that we deal with probability distributions and not with “structureless” points [12]. A second remarkable feature is that the information geometry we introduce does not define the full Riemannian geometry of space but only its conformal geometry. This appears at first to be a threat to the whole program but it turns out to be just what we need in a theory of gravity [13].

    (Many of the author’s other papers will be of interest in the context of this post.)
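    For readers who have not met the Fisher-Rao metric mentioned in the quoted passage, here is a minimal sketch (a toy example, assuming each “point” carries a Gaussian blur of width sigma, which is only loosely in the spirit of the quoted program): the information distance between two nearby points comes out as their separation measured in units of the blur.

        import numpy as np

        sigma = 0.5                      # width of the distribution attached to each point
        mu1, mu2 = 0.0, 0.01             # two nearby "points" (means of the Gaussians)

        x = np.linspace(-5, 5, 20001)
        dx = x[1] - x[0]

        def blur(mu):
            return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

        # For nearby distributions the Fisher-Rao distance can be computed from the
        # Bhattacharyya overlap: ds ~ 2 * arccos( integral of sqrt(p1 * p2) dx )
        overlap = np.sum(np.sqrt(blur(mu1) * blur(mu2))) * dx
        ds = 2 * np.arccos(min(overlap, 1.0))

        print(ds, (mu2 - mu1) / sigma)   # the two numbers agree: distance = separation / blur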

  5. Philip Goyal, Ariel Caticha, and Chris Fuchs all presented papers in the morning “Foundations” session at MaxEnt 2005.

  6. Okay, that is just plain strange: quant-ph/0701181 seems to be basically Wootters’s argument. Who needs pseudo-telepathy when the real one seems to work just fine 😉

  7. Hey, do I get a prize if I did know about this result? 🙂 My respect for Bill can be inferred from his membership in a very select group: people whose collected papers I’ve methodically downloaded in their entirety when possible. Unfortunately, much of his early work was published in Found. Phys., whose back issues are not available online. This, frankly, is a tragedy.
    Ten years ago, I had an interesting series of conversations with Ben Schumacher about what might be called the Elegance Principle. Physical theories are convincing and plausible inasmuch as they are elegant — where elegance is a totally undefinable and subjective concept that’s vaguely related to Kolmogorov complexity. Ben’s position, IIRC, was that GR is a lot more elegant than QM — and therefore a better candidate for being “true” in a platonic sense.
    This is one of the few scientific points on which I’ve managed to disagree with Ben for any length of time. I suspect that QM is a much more elegant theory than we often give it credit for — it manages to self-consistently generalize classical probability theory, which is a nontrivial task. Bill’s result is one of the earliest pieces of evidence for this elegance. Some of the most intriguing recent work in the same tradition (to me) includes Lucien Hardy’s axioms, and Scott Aaronson’s argument that changing QM tends to give you ridiculously powerful (PP-complete) computers.

  8. At Caltech there used to be a service where you could search for articles by an author and then ask for the library to print out all of those articles. I once did this for Wootters and printed out all of his articles. How could I resist with such titles as “Quantum Mechanics Without Time” (or something like that.)

  9. Can we change the past?
    From Art: The news media is starting to cover retrocausality.
    This is what I have been writing about all week, and the picture from this article is already in
    http://qedcorp.com/APS/Dec122006.ppt
    updated this evening including the separate UR “stencil” optical buffering experiment that I think may be adaptable to the Cramer experiment below to get more complex wave patterns than “two slit” imaged via entanglement.
    The headline is not correct. The events are linked in a global consistent loop in time. Nothing is “changed in the past” – the past happened the way it happened because of a future cause. Free will is suspended in that kind of retro-causation.
    I modify Cramer’s experiment in the following way. Use an arbitrary stencil for the future delayed choice encoding and use a simple diffraction grating for the past receiver. Forget the double slits. Also we use Born probability axiom, what we do assume is that the pair state is non-metrical.
    Now I am not sure if this will work because it strongly violates the conventional wisdom of The Pundits in “signal locality” proofs like the “no cloning a quantum” theorem based upon axioms of:
    1. Born probability assumption (A. Valentini’s “sub-quantal equilibrium”)
    2. Unitarity
    3. Linear superposition of quantum amplitudes
    of orthodox micro-quantum theory.
    Given an arbitrary superposition qubit
    |1) = |+)(+|1) + |-)(-|1)
    A clone machine C would make
    |2) = C|1) = |1)|1) = (|+)(+|1) + |-)(-|1))(|+)(+|1) + |-)(-|1))
    But, if this C machine is a linear operator then
    C|1) = C( |+)(+|1) + |-)(-|1)) = |+)|+)(+|1) + |-)|-)(-|1) =/= |2)
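    A minimal numerical rendering of this linearity argument (a sketch with made-up amplitudes; the basis-cloning map below is just the linear extension described above): a machine that copies the basis states and acts linearly does not copy a superposition.

        import numpy as np

        plus = np.array([1.0, 0.0])      # |+)
        minus = np.array([0.0, 1.0])     # |-)

        a, b = 0.6, 0.8                  # arbitrary amplitudes with a^2 + b^2 = 1
        one = a * plus + b * minus       # |1) = |+)(+|1) + |-)(-|1)

        def clone_linearly(state):
            # linear extension of the basis-state rule |+) -> |+)|+), |-) -> |-)|-)
            return state[0] * np.kron(plus, plus) + state[1] * np.kron(minus, minus)

        true_clone = np.kron(one, one)        # what |1)|1) actually is
        attempt = clone_linearly(one)         # what the linear machine produces

        print(np.allclose(true_clone, attempt))   # False: the cross terms are missing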
    That’s one issue. Another issue is given a maximally entangled pair state
    |ab) = |a+)|b+)(++|ab) + |a-)|b-)(--|ab)
    Assume that the encoder is a LOCAL UNITARY LINEAR OPERATOR U(b) acting only on b.
    U(b)|ab) = |a+)U(b)|b+)(++|ab) + |a-)U(b)|b-)(--|ab)
    The probability to detect a in the + state is
    P(a+|U) = (ab|U(b)*|a+)(a+|U(b)|ab)
    = ((ab|++)(a+|(b+|U(b)* + (ab|--)(a-|(b-|U(b)*)|a+)(a+|(|a+)U(b)|b+)(++|ab) + |a-)U(b)|b-)(--|ab))
    = (ab|++)(++|ab)(b+|U(b)*U(b)|b+)
    using (a+|a-) = 0
    (a+|a+) = 1
    But unitarity is
    U(b)*U(b) = 1
    Therefore
    P(a+|U) = P(a+)
    so that any local unitary change at b will not be seen or “imaged” at a.
    This is the standard “proof” of signal locality, but it is not convincing because it is circular, assuming what it seeks to prove in the axiom that U is already a local operator (e.g. Peacock & Hepburn).
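    For what it is worth, the textbook calculation criticized above is easy to check numerically. Here is a short Python sketch under standard quantum mechanics (the particular local unitary is an arbitrary choice): the probability of finding a in the + state is unchanged by any unitary applied to b alone.

        import numpy as np

        # Maximally entangled pair |ab) = (|a+)|b+) + |a-)|b-)) / sqrt(2)
        ab = (np.kron([1.0, 0.0], [1.0, 0.0]) + np.kron([0.0, 1.0], [0.0, 1.0])) / np.sqrt(2)

        # An arbitrary local unitary U(b) acting only on the b side
        th, ph = 0.7, 1.9
        U_b = np.array([[np.cos(th), -np.exp(1j * ph) * np.sin(th)],
                        [np.exp(-1j * ph) * np.sin(th), np.cos(th)]])
        I2 = np.eye(2)

        proj_a_plus = np.kron(np.outer([1, 0], [1, 0]), I2)   # |a+)(a+| on a, identity on b

        state = np.kron(I2, U_b) @ ab
        p_with_U = np.real(state.conj() @ proj_a_plus @ state)
        p_without = np.real(ab.conj() @ proj_a_plus @ ab)

        print(np.isclose(p_with_U, p_without))   # True: P(a+|U) = P(a+) = 1/2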
    http://qedcorp.com/APS/in_laser_1.jpg
    Cramer’s experiment above with entangled pairs
    http://qedcorp.com/APS/URstencil.jpg
    University of Rochester experiment not with entangled pairs
    but with an optical buffer.
    However the optical buffer uses the UR stencil above that is simply a more complex “double slit” (also a stencil) in the first stage (a) before the light pulse is slowed down in the buffer – the image is seen in (b) the third stage on exiting the buffer (delay line).
    Now, suppose we have the situation above in Cramer’s experiment with momentum correlations for photons a & b at x and x’ respectively, assuming initially equal probabilities for computational simplicity and assuming the entanglement correlation persists non-metrically or globally topologically in time as well as space (as, so far, shown experimentally)
    |a,b) ~ Integral over k of |+ka)|-kb)
    The delayed choice filter stencil (UR) at x’ modulates the k-pattern of sender photon b
    (UR)|a,b) = Integral over k of (UR) |+ka)|-kb) = Integral over k of Sk(b) |+ka)|-kb)
    Where {Sk(b)} is the set of signal modulation coefficients that is the “content” of the message encoded at b
    Using a diffraction grating, the “wave” measurement at any time t for receiver twin photon a is
    Pa(k) = (a,b|(UR)*|ka)(ka|(UR)|a,b) = Integral over k’ Integral over k” Sk’(b)* (+k’a|(-k’b| |ka)(ka| Sk”(b) |+k”a)|-k”b)
    = |Sk(b)|^2
    Note, if we use the stencil in delayed choice, we do not use a double slit for the receiver, we use a diffraction grating.
    On Jan 26, 2007, at 11:17 PM, ANTIGRAY@cs.com wrote:
    Science hopes to change events that have already occurred
    Patrick Barry
    Sunday, January 21, 2007
    Ever wish you could reach back in time and change the past? Maybe you’d like to take back an unfortunate voice mail message, or rephrase what you just said to your boss. Or perhaps you’ve even dreamed of tweaking the outcome of yesterday’s lottery to make yourself the winner.
    Common sense tells us that influencing the past is impossible — what’s done is done, right? Even if it were possible, think of the mind-bending paradoxes it would create. While tinkering with the past, you might change the circumstances by which your parents met, derailing the key event that led to your birth.
    Such are the perils of retrocausality, the idea that the present can affect the past, and the future can affect the present. Strange as it sounds, retrocausality is perfectly permissible within the known laws of nature. It has been debated for decades, mostly in the realm of philosophy and quantum physics. Trouble is, nobody has done the experiment to show it happens in the real world, so the door remains wide open for a demonstration.
    It might even happen soon. Researchers are on the verge of experiments that will finally hold retrocausality’s feet to the fire by attempting to send a signal to the past. What’s more, they need not invoke black holes, wormholes, extra dimensions or other exotic implements of time travel. It should all be doable with the help of a state-of-the-art optics workbench and the bizarre yet familiar tricks of quantum particles. If retrocausality is confirmed — and that is a huge if — it would overturn our most cherished notions about the nature of cause and effect and how the universe works.
    Dating back to Newton’s laws of motion, the equations of physics are generally “time symmetric” — they work as well for processes running backward through time as forward. The situation got really strange in the early 20th century when Einstein devised his theory of relativity, with its four-dimensional fabric of space-time. In this model, our sense that history is unfolding is an illusion: The past, present and future all exist seamlessly in an unchanging “block” universe.
    “If you have the block universe view, the future and the past are not any different, so there’s no reason why you can’t have causes from the future just as you have causes from the past,” says David Miller of the Centre for Time at the University of Sydney in Australia.
    With the advent of quantum mechanics in the 1920s, the relative timing of particles and events became even less relevant. “Real temporal order in general, for quantum mechanics, is not important,” says Caslav Brukner, a physicist at the University of Vienna, Austria. By the 1940s, researchers were exploring the possibility of time-reversed phenomena. Richard Feynman lent credibility to the idea by proposing that particles such as positrons, the antimatter equivalent of electrons, are simply normal particles traveling backward in time. Feynman later expanded this idea with his mentor, John Wheeler of Princeton University. Together they worked out a theory of electrodynamics based on waves traveling forward and backward in time. Any proof of reverse causality, however, remained elusive.
    Fast forward to 1978, when Wheeler proposed a variation on the classic double-slit experiment of quantum mechanics. Send photons through a barrier with two slits in it, and choose whether to detect the photons as waves or particles. If you put up a screen behind the slits, you will get a pattern of light and dark bands, as if each photon travels through both slits and interferes with itself, like a wave. If, on the other hand, you take a snapshot of the slits themselves, you will find each photon passes through one slit or the other: it is forced to pick a path, like a particle. But, Wheeler asked, what if you wait until just after the photon has passed the slits to make your choice? In theory, you could suddenly raise the screen to expose two cameras behind it, one trained on each slit. It would seem that you can affect where the photon went, and whether it behaved like a wave or particle, after the fact.
    In 1986, Carroll Alley, at the University of Maryland at College Park, found a way to test this idea using a more practical set-up: an interferometer which lets a photon take either one path or two after passing through a beam splitter. Sure enough, the photon’s path depended on a choice made after the photon had to “make up its mind.” Other groups have confirmed similar results, and at first blush this appears to show the present affecting the past. Most physicists, however, take the view that you can’t say which path the photon took before the measurement is made. In other words, still no unambiguous evidence for retrocausality.
    That’s where John Cramer comes in. In the mid-1980s, working at the University of Washington in Seattle, he proposed the “transactional interpretation” of quantum mechanics, one of many attempts to relate the mathematics of quantum theory to the real world. It says particles interact by sending and receiving physical waves that travel forward and backward through time. In June, at a conference of the American Association for the Advancement of Science, Cramer proposed an experiment that can at last test for this sort of retrocausal influence. It combines the wave-particle effects of double slits with other mysterious quantum properties in an all-out effort to send signals to the past.
    The experiment builds on work done in the late 1990s in Anton Zeilinger’s lab, when he was at the University of Innsbruck, Austria. Researcher Birgit Dopfer found that photons that were “entangled”, or linked by their properties such as momentum, showed the same wave-or-particle behavior as one another. Using a crystal, Dopfer converted one laser beam into two so that photons in one beam were entangled with those in the other, and each pair was matched up by a circuit known as a coincidence detector. One beam passed through a double slit to a photon detector, while the other passed through a lens to a movable detector, which could sense a photon in two different positions.
    The movable detector is key, because in one position it effectively images the slits and measures each photon as a particle, while in the other it captures only a wave-like interference pattern. Dopfer showed that measuring a photon as a wave or a particle forced its twin in the other beam to be measured in the same way.
    To use this setup to send a signal, it needs to work without a coincidence circuit. Inspired by Raymond Jensen at Notre Dame University, Cramer then proposed passing each beam through a double slit, not only to give the experimenter the choice of measuring photons as waves or particles, but also to help track photon pairs. The double slits should filter out most unentangled photons and either block or let pass both members of an entangled pair, at least in theory. So a photon arriving at one detector should have its twin appear at the other. As before, the way you measure one should affect the other. Jensen suggested that such a setup might let you send a signal from one detector to another instantaneously — a highly controversial claim, since it would seem to demonstrate faster-than-light travel.
    If you can do that, Cramer says, why not push it to be better-than-instantaneous, and try to make the signal arrive before it was sent? His extra twist is to run the photons you choose how to measure through several kilometers of coiled-up fiber-optic cable, thereby delaying them by microseconds. This delay means that the other beam will arrive at its detector before you make your choice. However, since the rules of quantum mechanics are indifferent to the timing of measurements, the state of the other beam should correspond to how you choose to measure the delayed beam. The effect of your choice can be seen, in principle, before you have even made it.
    That’s the idea anyway. What will the experimenters actually see? Cramer says they could control the movable detector so that it alternates between measuring wave-like and particle-like behavior over time. They could compare that to the pattern from the beam that wasn’t delayed and was recorded on a sensor from a digital camera. If this consistently shifts between an interference pattern and a smooth single-particle pattern a few microseconds before the respective choice is made on the delayed photons, that would support the concept of retrocausality. If not, it would be back to the drawing board.
    If the experiment does show evidence for retrocausation, it would open the door to some troubling paradoxes. If you could see the effects of your choice before you make it, could you then make the opposite choice and subvert the laws of nature? Some researchers have suggested retrocausality can occur only in limited circumstances in which not enough information is available for you to contradict the results of an experiment.
    Another way to resolve this is to say that even if the present can influence the past, it cannot change it. The fact that your hair is shorter today has as much influence on your going to the barber yesterday as the other way around, yet you can’t change that decision. “You wouldn’t be able to talk about altering, but you could talk about causing or affecting,” says Phil Dowe, an expert on causation at the University of Queensland in Australia. While it would mean we cannot change the past, it also implies that we cannot change the future.
    If all that gives you a headache, then consider this: if retrocausality does exist, it says something profound about how the universe works. “It has the potential to solve what is one of the biggest problems in modern physics,” says Huw Price, head of Sydney’s Centre for Time. It goes back to quantum entanglement and “nonlocality” — one particle instantaneously affecting another, even from the other side of the galaxy. That doesn’t sit well with relativity, which states that nothing can travel faster than light. Still, the latest experiments confirm that one particle can indeed instantaneously affect the other. Physicists argue that no information is transmitted this way: Whether the spin of a particle is up or down, for instance, is random and can’t be controlled, and thus relativity is not violated.
    Retrocausality offers an alternative explanation. Measuring one entangled particle could send a wave backward through time to the moment at which the pair was created. The signal would not need to move faster than light; it could simply retrace the first particle’s path through space-time, arriving back at the spot where the two particles were emitted. There, the wave can interact with the second particle without violating relativity. “Retrocausation is a nice, simple, classical explanation for all this,” Dowe says.
    While Cramer last week prepared to start a series of experiments leading up to the big test of retrocausality, some researchers expect reverse causality will play an increasingly important role in our understanding of the universe. “I’m going with my gut here,” says Avshalom Elitzur, a physicist and philosopher at Bar-Ilan University in Israel, “but I believe that when we finally find the theory we’re all looking for, a theory that unifies quantum mechanics and relativity, it will involve retrocausality.”
    But if it also involves winning yesterday’s lottery, Cramer won’t be telling.
    Did we reach back to shape the Big Bang? If retrocausality is real, it might even explain why life exists in the universe — exactly why the universe is so “finely tuned” for human habitation. Some physicists search for deeper laws to explain this fine-tuning, while others say there are millions of universes, each with different laws, so one universe could quite easily have the right laws by chance and, of course, that’s the one we’re in.
    Paul Davies, a theoretical physicist at the Australian Centre for Astrobiology at Macquarie University in Sydney, suggests another possibility: The universe might actually be able to fine-tune itself. If you assume the laws of physics do not reside outside the physical universe, but rather are part of it, they can only be as precise as can be calculated from the total information content of the universe. The universe’s information content is limited by its size, so just after the Big Bang, while the universe was still infinitesimally small, there may have been wiggle room, or imprecision, in the laws of nature.
    And room for retrocausality. If it exists, the presence of conscious observers later in history could exert an influence on those first moments, shaping the laws of physics to be favorable for life. This may seem circular: Life exists to make the universe suitable for life. If causality works both forward and backward, however, consistency between the past and the future is all that matters. “It offends our common-sense view of the world, but there’s nothing to prevent causal influences from going both ways in time,” Davies says. “If the conditions necessary for life are somehow written into the universe at the Big Bang, there must be some sort of two-way link.”
    — Patrick Barry
    Retrocausality: Can the present affect the past?
    Researchers have devised an experiment using laser light to demonstrate a property of quantum mechanics: That pairs of entangled photons show identical properties as either a wave or a particle. By using this knowledge, they hope to demonstrate how to influence an event that has already occurred.
    1. A laser beam is directed into a crystal that makes two streams of photons.
    2a. One stream of photons travels through a screen with two slits.
    2b. The other stream of photons travels through an identical screen with two slits BUT is routed through six miles of fiber-optic cable that delays the light by microseconds.
    3a. A detector captures the light and records it as a wave-like or particle-like photon (you don’t know which yet).
    3b. The delayed light is sensed by a movable detector. If the detector is closer to the lens it’s recorded as a wave-like interference pattern. If it’s farther from the lens it is recorded as a particle.
    What is happening here: By choosing to measure the delayed photon as either a wave or particle photon, the experimenter forces the other photon to appear in the same way – because they are entangled – even though it reaches the detector earlier.
    Sources: John Cramer, University of Washington; NewScientist, Sept. 2006
    Patrick Barry wrote this piece for the New Scientist, where it first appeared. Contact us at insight@sfchronicle.com.
    Jack Sarfatti
    sarfatti@pacbell.net
    “If we knew what it was we were doing, it would not be called research, would it?”
    – Albert Einstein
    http://www.authorhouse.com/BookStore/ItemDetail.aspx?bookid=23999
    http://lifeboat.com/ex/bios.jack.sarfatti
    http://qedcorp.com/APS/Dec122006.ppt
    http://video.google.com/videoplay?docid=-1310681739984181006&q=Sarfatti+Causation&hl=en
    http://www.flickr.com/photos/lub/sets/72157594439814784

  10. I would like to gently advance the notion that if a notable physicist says that a problem is intractable, then it is most usually true that the ideas needed to solve the problem have already been published, but are not yet recognizable as being applicable to the problem in question. In this regard, Feynman’s article Simulating Physics With Computers is particularly notable for the scope of its review—because Feynman includes no references at all to the literature!
        This makes Feynman’s 1982 article very pleasant to read (which I think was his main intent) but with hindsight, we can see that numerous key articles, and more importantly, numerous key ideas, were already in print in 1982 that today are greatly expanding the scope of (what we would today call) P-time and P-space quantum simulation.
        For example, we can point to the algebraic theorems of Choi and Stinespring, as connected to physics by Davies, Kraus, Mensky, and Sudarshan, as connected to geometry by Kahler and Chern, as connected to engineering model order reduction by Stoker, Nickell, Remseth, and Morris, who in turn drew on the 1934 work of Dirac and Frenkel, all of which is intimately connected to information theory and algorithmic compression by the work of Shannon.
        Whew! As Feynman himself said in his Nobel Prize lecture (1966): “We are struck by the very large number of physical viewpoints and widely different mathematical formalisms that are all equivalent to one another.”
        Nowadays, AFAICT, what all this literature is boiling down to are these three key principles: I. Only a quantum computer can efficiently simulate another quantum computer. II. Conversely, all non-computational quantum processes can be classically simulated in P-time and P-space. III. “Simulated in principle” increasingly means “simulated in practice”, and this is where quantum science intersects with quantum engineering.
        Whether this quantum algorithmic progress represents “a nightmare scenario for many physicists” (as Adrian Cho recently called it) versus a broad-spectrum advance in science and technology obviously depends in great measure on one’s own personal goals and point of view. I would be very interested to hear people’s perspectives on this important yet highly personal topic.

  11. Robin, I would argue that elegance = symmetry. Elegance could be argued, in some sense, to be another way of expressing beauty and beauty has been linked to symmetry (see studies by Thornhill [U. New Mexico] and Grammer [U. Vienna]). So, in that sense I would say that elegance is definable. Marquardt (ok, he’s a retired plastic surgeon, but…) has found mathematical links to physical beauty and it is clearly a symmetric phenomenon. On the other hand, I’m not overly familiar with the nuances of Kolmogorov complexity so I have no idea how this might relate (suppose I’d have to think about it).

  12. “a nightmare scenario for many physicists”
    Can you point me (us) to anything online that elaborates on Adrian Cho’s point of view?

  13. John, I don’t believe II! Indeed the exciting thing is that we don’t know how far recent insights into simulating quantum systems efficiently on a classical computer go (although we do know that they can only go so far, unless some crazy computational complexity class collapses happen.)

  14. I posted this yesterday, but it disappeared into the “ether”. So, here’s a second try.
          With regard to “II. Conversely, all non-computational quantum processes can be classically simulated in P-time and P-space,” well, for an engineer that’s a pretty good working definition of “non-computational”! Whether this definition could be made useful in proving formal theorems, I dunno. E.g., there might well be quantum processes for which it is formally undecidable whether they are computational, which would mean AFAICT that they are! 🙂
          As for Adrian Cho’s post, my BibTeX entry follows. In overview, there have been more than 100,000 articles published on high-temperature superconductivity, with no consensus having emerged as to the underlying physical mechanism. The “greatest nightmare” for many physicists, according to Adrian, is that high-temperature superconductivity embodies a mixture of physical mechanisms so rich that only large-scale computer simulations can “explain” it.
          There is ample precedent for this in chess, where the desktop program Rybka easily beats all other chess programs and all human grandmasters (Rybka presently plays at Elo 3000+). Older grandmasters grumble at Rybka’s dominance, but younger grandmasters have adapted well. Grandmasters of all ages universally use Rybka, and other programs, to evaluate new moves, and so the quality of human chess has never been higher. And yet, the computers do make moves that humans cannot understand, e.g., computers can now perfectly play move sequences in the endgame that are hundreds of ply deep. The ascending power of quantum simulations is IMHO beginning to exert similar effects in physics and (especially) engineering.
    —–
    @article{Cho:06, author = {A. Cho}, title = {High T${}_{\text{c}}$: the mystery that defies solution}, journal = {Science}, year = 2006, volume = 314, pages = {1072--5}, jasnote = {Adrian Cho: “According to some estimates, they [scientists] have published more than 100,000 papers on the materials.” “The challenge then is to explain how electrons that fiercely repel each other manage to pair anyway. Some researchers argue that waves of magnetism play a similar role to the one phonons play in conventional superconductors. Others focus solely on how the electrons shuffle past one another in a quantum-mechanical game of chess. Still others say that patterns of charge or current, or even phonons, play a crucial role. Pairing might even require all of these things in combination, which would be many physicists’ nightmare scenario.”}, }

  15. Just to mention, folks who (like me) are interested in the interface between simulation and human cognition might enjoy this lively account of the recent Freestyle Chess championship, in which the competitors are “centaurs”, meaning human + computer partnership teams. In Freestyle Chess, when it comes to seeking chess advice, as John Cleese says in the immortal film Rat Race, “the only rule is, there are no rules.” 🙂
