Just what every physicist has dreamt of: a watch with a Geiger counter in it! Of course, you can also tell it is for physicists by the lack of style!
Science Fiction Clouds My Judgement
Strange stars:
Stars race around a black hole at the center of the Andromeda galaxy so fast that they could go the distance from Earth to the Moon in six minutes.
The finding, announced today, solves a mystery over the source of strange blue light coming from Andromeda’s center. But it generates a new puzzle: The stars’ phenomenal orbital velocity suggests they should never have formed in the first place.
Astronomers first spotted the blue light near Andromeda’s core in 1995. Three years later, another group determined that the light emanated from a cluster of hot, young stars. Nobody knew how many were involved.
New data from the Hubble Space Telescope reveal more than 400 blue stars that formed in a burst of activity roughly 200 million years ago, astronomers said.
The stars are packed into a disk that is just 1 light-year across.
That’s amazingly compact by cosmic standards. A light-year is the distance light travels in a year, about 6 trillion miles (10 trillion kilometers). The nearest star to our Sun is about 4.3 light-years away.
Unlikely setup
“The blue stars in the disk are so short-lived that it is unlikely in the long 12-billion-year history of Andromeda that such a short-lived disk would appear now,” said Tod Lauer of the National Optical Astronomy Observatory in Tucson, Arizona. “We think that the mechanism that formed this disk of stars probably formed other stellar disks in the past and will trigger them again in the future. We still don’t know, however, how such a disk could form in the first place. It still remains an enigma.”
The problem with science fiction is that whenever you read articles like this, and you see some strange configuration of stars in a cool locale, you immediately think…”Aliens!”
Quantum Soccer
Wee Kang Chua here in Singapore has pointed me to Quantum Soccer. If you want to decrease your productivity by a good ten, twenty percent, please click on the link.
Genius Grants
The list of this year's MacArthur award winners is out. The MacArthur awards are half-million-dollar, no-strings-attached awards for "creativity, originality, and potential." I will brag that Berkeley ended up with three awards this year. OK, that's enough bragging. At times like this, I think it is important to revisit this classic article from the Onion:
MacArthur Genius Grant Goes Right Up Recipient’s Nose
October 15, 2003 | Issue 39•40
ALBANY, NY—According to friends, the $500,000, five-year, no-strings-attached MacArthur Fellowship awarded to Jim Yong Kim earlier this month went right up the 43-year-old scientist’s nose. “Kim’s efforts to eradicate drug-resistant strains of tuberculosis in Russian prisons and Peruvian ghettos amazed everyone—as did his appetite for top-grade cocaine,” Marisa Amir said Monday. “As soon as that first check arrived, Kim was on the phone with his dealer, and two hours later, he was in a hot tub full of strippers.” His first installment of money gone, the scientist then returned to the task of developing a whole-cell cholera toxin recombinant B subunit vaccine.
If it really is unrestricted, I wonder if you can take the money and put it all on the number 13 in roulette?
Post-Quantum Cryptography
In the comments to Who Will Program the Quantum Computers?, Hal points to PQCrypto 2006, a conference on post-quantum cryptography. From their webpage:
PQCrypto 2006: International Workshop on Post-Quantum Cryptography
Will large quantum computers be built? If so, what will they do to the cryptographic landscape?
Anyone who can build a large quantum computer can break today’s most popular public-key cryptosystems: e.g., RSA, DSA, and ECDSA. But there are several other cryptosystems that are conjectured to resist quantum computers: e.g., the Diffie-Lamport-Merkle signature system, the NTRU encryption system, the McEliece encryption system, and the HFE signature system. Exactly which of these systems are secure? How efficient are they, in theory and in practice?
PQCrypto 2006, the International Workshop on Post-Quantum Cryptography, will look ahead to a possible future of quantum computers, and will begin preparing the cryptographic world for that future.
Prepare for us, world: we quantum computers are coming (well, maybe not so fast 😉 )
Life Around Black Holes
I just started reading A Fire Upon the Deep by Vernor Vinge. Interestingly, in Vinge's universe the laws of physics come in strata. In the galaxy proper, the speed of light is finite, but this changes as one moves away from the galaxy: the farther one gets from the galaxy, the more amazing the technology one can build and operate.
Which got me thinking about Scott Aaronson's paper NP-complete Problems and Physical Reality. One issue Scott discusses in this paper is something you will hear over many coffee breaks at quantum computing conferences: can one use relativity to create exponential speedups in computational power? One way you might think of doing this involves time dilation. Set your computer up to calculate some hard problem. Then board a spaceship and head off for a leisurely trip around the galaxy at speeds nearing the speed of light. When you return to your computer, via the twin paradox the computer will be much older than you and will, hopefully, have solved the problem. Roughly, if your speed is [tex]$\beta=v/c$[/tex], then you can get a speedup for your computation proportional to [tex]$(1-\beta^2)^{-\frac{1}{2}}$[/tex]. The problem with this scheme, it appears, is that in order for it to work you need to get your [tex]$\beta$[/tex] exponentially close to one, and this would require an exponential amount of energy. So it doesn't seem that such a scheme would work. Another proposal, along similar lines, is to set up your computer and then travel very close to a black hole (not recommended: only trained professionals should attempt this). Then, due to gravitational time dilation, you can mimic the twin experiment and return to a (hopefully) solved problem. However, again, it seems that getting yourself back out of the gravitational well requires an amount of energy which destroys the effect.
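For the curious, here is a little back-of-the-envelope sketch of why the energy cost kills the spaceship trick. The "speedup" is just the Lorentz factor [tex]$(1-\beta^2)^{-\frac{1}{2}}$[/tex], and the kinetic energy you have to pay grows right along with it, so an exponential speedup costs exponential energy. The spaceship mass below is a made-up number purely for illustration.

```python
import numpy as np

c = 299_792_458.0   # speed of light, m/s
m_ship = 1.0e5      # hypothetical spaceship mass in kg (made-up number)

def lorentz_gamma(beta):
    """Time-dilation factor gamma = (1 - beta^2)^(-1/2) for beta = v/c."""
    return 1.0 / np.sqrt(1.0 - beta**2)

def kinetic_energy(beta, mass):
    """Relativistic kinetic energy (gamma - 1) m c^2, in joules."""
    return (lorentz_gamma(beta) - 1.0) * mass * c**2

# The computational "speedup" is just gamma: computer-frame time elapsed
# per unit of traveler proper time.  Doubling the speedup roughly doubles
# the kinetic energy, so exponential speedup means exponential energy.
for beta in [0.9, 0.99, 0.999999, 1 - 1e-12]:
    g = lorentz_gamma(beta)
    e = kinetic_energy(beta, m_ship)
    print(f"beta = {beta:.12f}   speedup ~ {g:12.1f}   energy ~ {e:.2e} J")
```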
But what Vinge's novel got me thinking about was the following: there appears to be a computational advantage to being away from masses. Assume that there is some form of life surrounding a black hole (OK, big assumption, but hey!). Then it seems to me that this computational advantage might contribute to a competitive advantage, in the evolutionary sense, for an intelligent species. Thus we might expect that a civilization for which gravitational time dilation is a real effect would live in a world much like Vinge's, where the less intelligent animals live close to the mass and the more intelligent, more magic-wielding creatures live farther away ("Any sufficiently advanced technology is indistinguishable from magic." -Arthur C. Clarke). Following Vinge's novel, one might even speculate about the computational advantage of being outside the galaxy. The time dilation effect there is about one part in a million (as opposed to one part in a billion for the effect at the surface of the Earth versus "at infinity"). Unfortunately this seems like too small a number to justify any such effect.
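If you want to check those rough numbers, the fractional gravitational time dilation is roughly the potential divided by [tex]$c^2$[/tex]. The masses and speeds plugged in below are just textbook round figures, so treat this as order-of-magnitude only.

```python
c = 2.998e8      # speed of light, m/s
G = 6.674e-11    # gravitational constant

# Earth's surface vs. "at infinity": fractional clock shift ~ G*M / (R*c^2)
M_earth, R_earth = 5.97e24, 6.37e6
print(G * M_earth / (R_earth * c**2))   # ~7e-10, about one part in a billion

# Deep in the galactic potential vs. intergalactic space: a crude estimate
# from the Sun's circular orbital speed gives a shift ~ v^2 / c^2
v_orbit = 2.2e5                          # ~220 km/s, a round number
print(v_orbit**2 / c**2)                 # ~5e-7, roughly one part in a million
```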
OK, enough wild speculation for a Tuesday.
Best Title Ever Submission: Pancakes
Steve submits the following for best paper title ever, astro-ph/0509039: “Local Pancake Defeats Axis of Evil” by Chris Vale. I wonder if Chris has informed North Korea, Iraq, and Iran that they have been defeated by a local pancake (is it better to be defeated by a nonlocal pancake?)
Actually this paper is about a very interesting subject. One can learn all sorts of interesting things about the cosmological history of our universe from the angular spectrum of the cosmic background radiation. I remember as a graduate student, when I thought I might go into astrophysics, taking a class in which we calculated this spectrum for all sorts of different cosmological models. The bumps in the spectrum across different spherical harmonics were distinctly different for many different models. Well, now, thanks to experiments like WMAP, we have exceedingly good information about this spectrum (the music of the universe, in poetic language). This allows us to very nicely rule out all sorts of models of the early history of our universe.

Interestingly, however, there is a possibly unexplained feature in the spectrum: the l=2 and l=3 components appear to be correlated. One explanation of this effect is simply that it is a statistical fluke; the anomaly itself has come to be known as the "axis of evil"! The "local pancake" in the title of the paper refers to the author's theory about this l=2, l=3 anomaly: he postulates that it is an effect of gravitational lensing due to the structure of mass in our local neighborhood (bet you didn't know we lived in a pancake, did you?). This lensing, the author claims, has a consequence for the l=1 (dipole) term. But why does that change the l=2 and l=3 components? Because the l=1 dipole term is usually subtracted out of the data, since it has large contributions from our proper motion with respect to the cosmic background radiation, and the author claims that the lensing effect causes this dipole to be subtracted incorrectly. Any good astrophysicists out there slumming on a quantum blog care to comment?
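If you have never played with this sort of data, here is roughly what the pipeline looks like: decompose a sky map into spherical harmonics, fit and remove the dipole, and read off the angular power spectrum. This is just a toy sketch on a fake sky using the healpy package (assumed installed), and certainly not anything from the actual paper.

```python
import numpy as np
import healpy as hp   # assumes the healpy package is available

nside, lmax = 64, 16

# Toy sky: a flat input spectrum, just to have something to decompose.
cl_in = np.ones(lmax + 1)
sky = hp.synfast(cl_in, nside)

# Add a large dipole by hand, standing in for our proper motion
# with respect to the cosmic background radiation.
theta, phi = hp.pix2ang(nside, np.arange(hp.nside2npix(nside)))
sky_with_dipole = sky + 50.0 * np.cos(theta)

# Standard practice: fit and remove the dipole before looking at l >= 2 ...
cleaned = hp.remove_dipole(sky_with_dipole)

# ... and then read off the angular power spectrum C_l.
cl_out = hp.anafast(cleaned, lmax=lmax)
print(cl_out[2], cl_out[3])   # the quadrupole (l=2) and octopole (l=3) terms
```

The paper's point, in this language, is that if the dipole you subtract is slightly wrong, the error leaks into exactly those l=2 and l=3 terms.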
Delta X Delta P
The science blogosphere is abuzz about Lisa Randall's op-ed article in the New York Times. See comments at Hogg's Universe, Not Even Wrong, Lubos Motl's Reference Frame, and Cosmic Variance. The article just made me happy; read the following paragraph:
“The uncertainty principle” is another frequently abused term. It is sometimes interpreted as a limitation on observers and their ability to make measurements. But it is not about intrinsic limitations on any one particular measurement; it is about the inability to precisely measure particular pairs of quantities simultaneously. The first interpretation is perhaps more engaging from a philosophical or political perspective. It’s just not what the science is about.
There is nothing that makes my Monday mornings brighter than a correct popular explanation of the uncertainty principle.
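And since nothing spoils a correct explanation like leaving the formula implicit: here is a small numerical check, in units where [tex]$\hbar=1$[/tex], that a Gaussian wave packet satisfies (in fact saturates) the bound Δx Δp ≥ ħ/2, which is the kind of statement the uncertainty principle actually makes. Just numpy, nothing fancy.

```python
import numpy as np

hbar = 1.0
sigma = 1.0
N = 2048
x = np.linspace(-40, 40, N)
dx = x[1] - x[0]

# Gaussian wave packet, normalized so that sum |psi|^2 dx = 1
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))

# Position uncertainty
prob_x = np.abs(psi)**2 * dx
mean_x = np.sum(x * prob_x)
delta_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x))

# Momentum-space wavefunction via FFT; p = 2*pi*hbar*frequency
phi = np.fft.fft(psi)
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
prob_p = np.abs(phi)**2
prob_p /= prob_p.sum()
mean_p = np.sum(p * prob_p)
delta_p = np.sqrt(np.sum((p - mean_p)**2 * prob_p))

# ~0.5 for a Gaussian, which saturates the bound hbar/2
print(delta_x * delta_p, ">=", hbar / 2)
```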
Optimistic Deutsch
David Deutsch thinks quantum computing is just around the corner. He has posted the following on his blog:
For a long time my standard answer to the question ‘how long will it be before the first universal quantum computer is built?’ was ‘several decades at least’. In fact, I have been saying this for almost exactly two decades … and now I am pleased to report that recent theoretical advances have caused me to conclude that we are within sight of that goal. It may well be achieved within the next decade.
The main discovery that has made the difference is cluster quantum computation, which is a marvellous new way of structuring quantum computations which makes them far easier to implement physically.
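For those who haven't seen cluster-state (measurement-based) quantum computation before, here is a tiny toy construction of the resource state involved. This is just my own illustration, not Deutsch's argument: prepare qubits in |+>, apply controlled-Z gates between neighbors, and check one of the resulting stabilizers.

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cz(n, a, b):
    """Controlled-Z between qubits a and b on n qubits, as a dense matrix."""
    dim = 2 ** n
    U = np.eye(dim)
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[a] == 1 and bits[b] == 1:
            U[i, i] = -1.0
    return U

n = 3  # three qubits in a line: 0 - 1 - 2

# Start with |+>^n, then entangle neighbors with controlled-Z gates
state = plus
for _ in range(n - 1):
    state = np.kron(state, plus)
for a in range(n - 1):
    state = cz(n, a, a + 1) @ state

# A cluster state is stabilized by K_i = X_i * prod over neighbors j of Z_j.
# For the middle qubit that stabilizer is Z (x) X (x) Z.
K_middle = kron_all([Z, X, Z])
print(np.allclose(K_middle @ state, state))   # True
```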
H-index Me
Many of you have probably already seen this. Jorge Hirsch, a physicist from UCSD, has proposed an interesting way to measure the research impact of an author. For details, see this Nature article or Hirsch's original article physics/0508025. The basic idea of Hirsch's h-index is very simple: it is the largest number h such that the author has written h papers each with at least h citations. Thus, for instance, if an author had written five papers with 10, 6, 4, 2, and 1 citations, then the h-index would be three, because the fourth most cited paper has only two citations, which is less than four, but the third most cited paper has four citations, which is at least three. The highest h-index among physicists, Hirsch claims, belongs to Ed Witten, who has an h-index of 110. This means he has written 110 papers with at least 110 citations each. Wow! Another important quantity Hirsch defines is the average rate at which the h-index has grown per year over a career. This is just a person's current h-index divided by the time since they first started publishing. Witten has an astounding average increase in h-index of 3.9 per year over his career. What this all means is very much open to debate, but heck, it's kind of fun!
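If you want to compute your own, the definition fits in a few lines of Python. The career length below is just a made-up number, there only to illustrate Hirsch's per-year rate.

```python
def h_index(citations):
    """Largest h such that h of the papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 6, 4, 2, 1]))   # 3, matching the example above

# Hirsch's rate parameter: average growth of the h-index per year of career.
years_publishing = 30              # hypothetical career length
print(h_index([10, 6, 4, 2, 1]) / years_publishing)
```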
One thing which is nice about the h-index is that it is very simple to calculate using the ISI Web of Science citation tools or, more dangerously, from citebase. My h-index from citebase (access to Web of Science is painful from my current computer location) is 12. The funny thing is that Hirsch says this is about the h-index (with large error bars) at which one should get tenure. Haha, very funny!