Three-Toed Sloth (who has been attending the complex systems summer school in China which I was supposed to attend before my life turned upside down and I ran off to Seattle) has an interesting post on Landauer's principle. Landauer's principle is roughly the principle that erasing information in thermodynamics dissipates an amount of entropy of at least Boltzmann's constant times ln 2 for each bit erased. Cosma points to two papers, Orly Shenker's "Logic and Entropy" and John Norton's "Eaters of the Lotus", which both claim problems with Landauer's principle. On the bus home I had a chance to read both of these papers, and at least get an idea of what the arguments are. Actually, both articles point towards the same problem.
Here is a simplistic approach to Landauer's principle. Suppose you have a bit which has two values of a macroscopic property, which we call 0 and 1. Also suppose that there are other degrees of freedom for this bit (like, say, the pressure of whatever is physically representing the bit). Now make up a phase space with one axis representing the 0 and 1 variable and another axis representing these other degrees of freedom. Actually, let's fix this external degree of freedom to be the pressure, just to make notation easier. Imagine now the process which causes erasure. Such a process will take 0 to 0, say, and 1 to 0. Now look at this process in phase space. Remember that phase-space volumes must be conserved. Examine now two phase-space volumes. One corresponds to the bit being 0 and some range of the pressure. The other corresponds to the bit being 1 and this same range of pressures. In the erasure procedure we take 1 to 0, but now, because phase-space volume must be preserved, we necessarily must change the values of the extra degree of freedom (the pressure): we can't map the "1 plus range of pressures" region onto the "0 plus the same range of pressures" region, because this latter bit of phase space is already used up. What this necessitates is an increase of entropy, which at its smallest is k ln 2.
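Here is a toy numerical illustration of this volume counting; a minimal sketch of my own, where the discrete "cells", the auxiliary coordinate standing in for the pressure, and the erase map are all invented stand-ins for the continuous phase space:

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    # Toy discrete phase space: a bit (0 or 1) and an auxiliary coordinate
    # (our stand-in for the pressure) taking M values.
    M = 8
    before = [(b, a) for b in (0, 1) for a in range(M)]  # 2*M occupied cells

    # A volume-preserving (bijective) erasure: the bit always ends up 0,
    # so its old value must be pushed into the auxiliary coordinate.
    def erase(b, a):
        return (0, a + b * M)  # the auxiliary range doubles from M to 2*M

    after = [erase(b, a) for b, a in before]
    assert len(set(after)) == len(before)  # bijective: phase volume conserved

    def entropy(volume):  # S = k ln(volume) for a uniform ensemble
        return k * math.log(volume)

    print(entropy(2) - entropy(1))      # k ln 2 leaves the bit...
    print(entropy(2 * M) - entropy(M))  # ...and reappears in the auxiliary dof

Both lines print the same k ln 2: the bijection cannot destroy the bit's entropy, only relocate it into the other degrees of freedom.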
From my quick reading of these articles, their issue is not so much with this argument per se, but with its interpretation (by which I mean they do not challenge the logical consistency of Landauer's and others' formulations of the principle, but instead challenge the interpretation of the problem these authors claim to be solving). In both articles we find the authors particularly concerned with how to treat the macroscopic variables corresponding to the bits 0 and 1. In particular they argue that the above type of argument implicitly treats these macroscopic variables as thermodynamic-physical magnitudes, and that this should not be done. The author of the first paper makes this explicitly clear by replacing the phase-space picture I've presented above with two pictures, one in which the bit of information is 0 and one in which the bit of information is 1, and by stating things like "A memory cell that – possibly unknown to us – started out in macrostate 0 will never be in macrostate 1" (emphasis the author's). The author of the second article makes a similar point, in particular pointing out that "the collection of cells carrying random data is being treated illicitly as a canonical ensemble."
What do I make of all this? Well, I'm certainly no expert. But it seems to me that these arguments center upon some very deep and interesting problems in the interpretation of thermodynamics, and also, I would like to argue, upon the fact that thermodynamics is not complete (this may be as heretical as my statement that thermodynamics is not universal, perhaps even more heretical!) What do I mean by this? Consider, for example, one of our classical examples of memory, the two (or greater) dimensional ferromagnetic Ising model. In such a model we have a bunch of spins on a lattice, with nearest-neighbor interactions which have lower energy when the spins are aligned. In the classical thermodynamics of this system, above a certain critical temperature, in thermodynamic equilibrium, the total magnetization of the system is zero. Below this temperature, however, something funny happens. Two thermodynamic equilibrium states appear, one with the magnetization pointing mostly in one direction and one with the magnetization pointing mostly in the other direction. These are the two states into which we "store" information. But when you think about what is going on here, this bifurcation into two equilibria, you might wonder about the "completeness" of thermodynamics. Thermodynamics does not tell us which of these states is occupied, nor even that, say, each is occupied with equal probability. Thermodynamics does not answer a very interesting question: what is the probability distribution for the bit of stored information?
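To see this bifurcation concretely, here is a minimal Metropolis sketch of the 2D Ising model; this is my own toy code, and the lattice size, temperature, and number of sweeps are arbitrary illustrative choices:

    import random, math

    def metropolis(spins, L, T, J=1.0, sweeps=200):
        """Single-spin-flip Metropolis dynamics on an L x L periodic lattice."""
        for _ in range(sweeps * L * L):
            i, j = random.randrange(L), random.randrange(L)
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j] +
                  spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2.0 * J * spins[i][j] * nb  # energy cost of flipping this spin
            if dE <= 0 or random.random() < math.exp(-dE / T):
                spins[i][j] *= -1
        return spins

    def magnetization(spins, L):
        return sum(sum(row) for row in spins) / (L * L)

    L, T = 32, 1.5            # T well below Tc ~ 2.27 (in units of J/k)
    for start in (+1, -1):    # two initial conditions: all up, all down
        spins = [[start] * L for _ in range(L)]
        metropolis(spins, L, T)
        print(start, magnetization(spins, L))

Each run stays near the magnetization it started with: the dynamics, not the thermodynamics, picks which of the two long-lived states you find.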
And it is exactly this question around which the argument about Landauer's principle revolves. Suppose you decide that for quantities such as the total magnetization you treat the two values as totally separate settings, with totally different phase spaces which cannot be accessed at the same time. Then you are led to the objections to Landauer's principle sketched in the two papers. But now suppose you take the point of view that thermodynamics should be completed in some way that takes these macroscopic variables into account as real thermodynamic physical variables. How to do this? The point many physicists would make, it seems, is that no matter how you do it, once you've got these variables into the phase space, the argument presented above will produce a Landauer-type bound. Certainly one way to do this is to assert that we don't know which of the states the system is in (0 or 1), so we should assign each equal probability; but the point is that whatever probability assignment you make, you end up with a similar argument in terms of phase-space volume. Notice also that to really make these volumes, the macroscopic variables should have some "spread": what we call 0 and 1 are never precisely 0 and 1, but instead some region around the magnetization pointing in one direction and some region around the magnetization pointing in the other direction.
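To sketch the bookkeeping, under my own reading that a "probability assignment" means assigning a prior p to finding the bit in state 1: whatever p you pick, a single erasure procedure must dump the bit's entropy somewhere, and this is largest, k ln 2, at p = 1/2:

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    def erased_entropy(p):
        """Entropy (J/K) the bit loses under erasure, given prior P(bit=1) = p."""
        if p in (0.0, 1.0):
            return 0.0
        return -k * (p * math.log(p) + (1 - p) * math.log(1 - p))

    for p in (0.5, 0.25, 0.1):
        print(p, erased_entropy(p) / (k * math.log(2)))  # in units of k ln 2
    # p = 0.5 gives the familiar k ln 2; biased priors give less, never more.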
I really like the objections raised in these articles. But I'm not convinced that either side has won this battle. One interesting thing I note is that the argument against Landauer's principle treats the two macrostates 0 and 1 in a very "error-free" manner. That is to say, they treat these variables as truly digital values. But (one last heresy!) I'm inclined to believe that nothing is perfectly digital. The digital nature of information in the physical world is an amazingly good approximation for computers... but it does fail. If you were able to precisely measure the information stored on your hard drive, you would not find zeros and ones, but instead zeros plus some small fluctuations and ones plus some small fluctuations. Plus, if there is ever an outside environment influencing the variable you are measuring, then it is certainly true that eventually your information will, thermodynamically, disappear (see my previous article on Toom's rule for hints as to why this should be so). So in that case, the claim that these two bit states should never be accessible to each other clearly breaks down. So I'm a bit worried (1) about the arguments against Landauer's principle, from the point of view that digital information is only an approximation, but also (2) about the arguments for Landauer's principle and the fact that they might somehow depend on how one completes thermodynamics to talk about multiple equilibria.
Of course, there is also the question of how all this works for quantum systems. But then we’d have to get into what quantum thermodynamics means, and well, that’s a battle for another day!
Update: be sure to read Cris Moore's take on these two papers in the comments section. One thing I didn't talk about was the example Shenker uses against Landauer's principle. This was mostly because I didn't understand it well enough; reading Cris's comments, I agree with him that this counterexample seems to have problems.
I have done my best to read Shenker’s and Norton’s articles. I have to say I’m less impressed with them than you are… but perhaps there is something I am failing to appreciate.
Norton says that the usual argument for Landauer's principle "depends on the incorrect assumption that memory devices holding random data occupy a greater volume in phase space and have greater entropy than when the devices have been reset to default data." In other words, he objects to the idea that bits with "random" values correspond to an ensemble.
Of course, “erasing” a bit which has a fixed previous value, “random” or not, can be done reversibly, either by flipping it or leaving it alone. But what we mean by “erase” is a procedure which sets a bit to zero regardless of its previous value. By definition, this maps the entire ensemble of possible values to the value 0, and this compresses the phase space. It seems to me that his claim that there are two cases in which the bit’s previous value is 0 or 1, and that in each of these cases the entropy does not increase, ignores the fact that we want a *single* physical process to set the bit to 0 in either case.
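To make the pigeonhole behind this "single process" point concrete, here is a tiny enumeration (a sketch added editorially, not anything from Norton's paper) over a toy joint space of (bit, aux) states:

    from itertools import permutations

    # Joint microstates: (bit, aux), with a deliberately tiny auxiliary space
    # so that every volume-preserving (bijective) dynamics can be enumerated.
    states = [(b, a) for b in (0, 1) for a in (0, 1)]

    def erases(images):
        # images[i] is where the map sends states[i]; erasure demands bit = 0.
        return all(out[0] == 0 for out in images)

    found = [p for p in permutations(states) if erases(p)]
    print(found)  # [] -- no single reversible map resets the bit on this space

By pigeonhole, only half the states have bit = 0, so any bijection must send some state to bit = 1; erasure needs either dissipation or a larger auxiliary space.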
Shenker’s proposed counterexample (a device which she claims erases with arbitrarily small dissipation) seems simply wrong to me. A peg starts out in one of two notches corresponding to 0 and 1, and is then moved to a “ready” notch. She says that dissipation occurs only from friction as we move it.
But we also have to stop its motion when it reaches the “ready” notch. If we attempt to stop its horizontal motion with a billiard-ball elastic collision instead of dissipating its kinetic energy, then we could reverse the path of this collision and learn its previous value. Therefore, it seems to me we have to dissipate its kinetic energy. If this kinetic energy is less than kT, then thermal fluctuations could “bounce” it out of the ready notch, or back along its path towards its previous value.
(Indeed, unless the notches correspond to potential wells of depth at least kT, thermal fluctuations could bounce it out of the 0 and 1 notches as well. This is similar to the argument that Smoluchowski's trapdoor has to require an energy of at least kT/2 to swing open, or thermal fluctuations will cause it to bang open and shut randomly.)
So, it seems to me her device again requires dissipation of O(kT) energy; the ln 2 would come from looking in detail at the random walk that thermal fluctuations would induce in the motion of the peg.
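(For scale, a quick check of the kT numbers in play here, taking T = 300 K as an illustrative choice:)

    import math

    k = 1.380649e-23       # Boltzmann's constant, J/K
    T = 300.0              # room temperature, an illustrative choice

    print(k * T * math.log(2))  # Landauer bound: ~2.9e-21 J per erased bit
    print(k * T / 2)            # the kT/2 trapdoor scale: ~2.1e-21 J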
– Cris
I tend to agree with Cris, and would make the same argument, phrasing it slightly differently. Return to the claim (Norton, page 17) that the thermodynamic entropy of a collection of cells with random data is the same as that of a collection of cells with reset data. It is certainly true that for each cell in the random set the molecule is located on one side or the other, and thus has the thermodynamic entropy associated with a molecule in the corresponding amount of phase space. However, the use of the word "random" is not superfluous: it indicates that, looking at any particular cell, we could reasonably expect it to be on either side of the partition. Hence the larger amount of occupied phase-space volume. This flatly contradicts the assertion, and the further claim that "Thermodynamic entropy is a property of the cell and its physical state; it is not affected by how we might imagine the cell to be grouped with other cells." But, in the interests of full disclosure, I'm coming at this more from the point of view of subjective entropy, so perhaps "I've already said too much."
Dave,
your statement about the Ising model
> Thermodynamics does not tell us which of these states is occupied, nor even that, say, each is occupied with equal probability.

is misleading. On a finite Ising lattice (size N) the expectation value of the magnetization is exactly zero. The limit N -> infinity does not change that.
Wolfgang: I think this is the heart of the question. I agree that if you assign Boltzmann weights to all energy states, then you get total magnetization zero. But I maintain that this is a VERY troubling aspect of thermodynamics, which is exactly why it has problems. If you start with a system which has all the spins aligned in one direction, then in the infinite-size limit (below the critical temperature) the probability of flipping to the state with all spins aligned in the other direction vanishes (assuming some concrete model of the relaxation of the Ising model). This problem, that thermodynamics says the two states have equal probabilities but that in practice the history of the initial configuration matters, leads me to claim that there is a basic incompleteness.
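A crude back-of-envelope sketch of why that flipping probability vanishes; this is my own estimate, in units where J = k = 1, with an arbitrary attempt time, and with the barrier taken to be the ~2JL cost of the domain wall that must sweep across an L x L lattice:

    import math

    def reversal_time(L, T, J=1.0, attempt_time=1.0):
        # Arrhenius-style estimate: rate ~ exp(-barrier / T), barrier ~ 2*J*L.
        return attempt_time * math.exp(2.0 * J * L / T)

    for L in (8, 32, 128):
        print(L, reversal_time(L, T=1.5))
    # The time diverges exponentially with L: from any single history, the
    # "equal probability" ensemble is never explored in the infinite-size limit.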
> in the infinite-size limit (below the critical temperature) the probability of flipping to the state with all spins aligned in the other direction vanishes

The way you state it, the problem is not with thermodynamics but with ergodicity: the time average does not equal the ensemble average.
> Especially when it comes to storing information?
If you store information you will use an external field H and your problem disappears.
> but the question is why does it retain this information
???
You turn on H > 0 for a brief time, the spins flip accordingly, and the lattice remains in this state for a long while (but not forever) if the temperature is low enough (as you have pointed out) and external perturbations (e.g. H) are small.
Nothing mysterious, or am I missing something?
> This isn't my understanding of how a magnetic storage device works.
I do not know enough about real world hard disks to understand whether the Ising model is a good description.
Wolfgang: Indeed. Should this trouble me?
Both limits (ensemble average and time average) are, to me, just approximations to what I'd do in the real world. But is there a good fundamental reason to worry more about ensemble averages? Especially when it comes to storing information?
Uh, that I don't understand. I can apply an external magnetic field to set the bit of information, but the question is why it retains this information even after the field is shut off. This isn't my understanding of how a magnetic storage device works.
No, I don't think you're missing anything! Certainly it's my ignorance on display here!
I guess the question I have, and I think this is the question that goes to the heart of the discussion about Landauer's principle, is how to treat this long lifetime. If we take thermodynamics straightaway, then of course we just say we haven't waited long enough for thermodynamic equilibrium to be reached. But in this setting, what does "storing information" really mean? It seems that this is a concept "outside" of thermodynamics. But in such a setting, couldn't we just as well argue that thermodynamic equilibrium is a "bad" concept? And since hard drives are basically Ising models (well, old ones; new ones have really cool things going on with them, but the effective idea is the same), the question is: if we are going to take these devices, acting as storage devices, seriously, then it seems that thermodynamic reasoning breaks down. That is, if we put in a long-time cutoff, then certainly thermodynamics doesn't have anything to say about the two highly stable states. I feel this critique is valid. It is not so much a critique of thermodynamics as of the fact that, for the purposes of talking about information storage in long-lived states, thermodynamics isn't the full language we need. This is why I argue that thermodynamics is incomplete. But I should clarify and say that I think it is incomplete when discussing long-lived macrostates.
“disipates an amount of entropy equal to Bolztman’s constant times natural log of the number of bits erased”
Actually it would be: Boltzmann's constant times the natural log of the number of possible states of the system. If we are talking about one bit, for example, the formula would be k ln(2^1) = k ln 2. If we have two bits, it would be k ln 4, and so on.
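(A quick numerical check that the two phrasings agree, with k in SI units:)

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    for n_bits in (1, 2, 3):
        states = 2 ** n_bits
        print(n_bits, k * math.log(states), n_bits * k * math.log(2))
    # k ln(2^n) equals n * k ln 2, so "k ln 2 per bit erased" and "k times the
    # log of the number of states" are the same statement.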
If I'm wrong please write to me; I'm very interested in this topic.
Ooops. Thanks rarruec, fixed that. How did that get by everyone reading this post?!?