Let’s multiply together a bunch of numbers which are less than one and see how small they get!
If that sounds like fun, then you’ll love this sleek new infographic (of which the above is just the teaser) posted this morning at BoingBoing. The graphic is based on this blog post by Dr. Ali Binazir, who apparently has an AB (same as a BA) from Harvard, an MD from the UC San Diego School of Medicine, and an M.Phil. from Cambridge.
I’ll save you the effort of clicking through: the good doctor estimates the probability of “your existing as you, today”. His estimate consists of (what else?) multiplying a bunch of raw probability estimates together without conditioning! And I’ll give you a hint as to the conclusion: the odds that you exist are basically zero! Astounding.
I should add that, at the end of the post, he does append the disclaimer "It's all an exercise to get you thinking…" and (obliquely) admit that the calculation is bogus.
Is there any branch of mathematics which is abused so extravagantly as probability? I think these sorts of abuses are beyond even the most egregious statistical claims, no?
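To see how badly this goes wrong, here is a minimal self-contained sketch (the events and numbers are invented for illustration, not taken from the post in question): ten strongly correlated events whose marginal probabilities, multiplied as if independent, understate the true joint probability by about nine orders of magnitude.

```python
# Ten events defined on a single uniform draw X ~ U(0, 1):
#   A_k = { X > 0.9 + 0.001*k },  k = 0..9
# Each event alone has probability ~0.1, but the events are nested,
# hence almost perfectly correlated -- independence fails badly.
thresholds = [0.9 + 0.001 * k for k in range(10)]

marginals = [1.0 - t for t in thresholds]     # exact P(A_k)
naive_product = 1.0
for p in marginals:
    naive_product *= p                        # "multiply without conditioning"

# Because the events are nested, the joint is just the rarest one:
joint = 1.0 - max(thresholds)

print(f"naive product : {naive_product:.2e}")  # ~6.3e-11
print(f"true joint    : {joint:.3f}")          # ~0.091
```

The same mechanism applies to "your parents met, then married, then…": each step is heavily conditioned on the previous ones, so multiplying raw marginals yields an absurdly small answer.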
A nurse was mistakenly convicted of murdering her patients based on similar bad statistics.
And lest you think the legal system has learned from this mistake, a UK appellate judge ruled that Bayesian reasoning was inadmissible in court.
Hmmm … wasn’t it Fermi who first asked “Where is everyone?”
And no one’s complaining about his question. 🙂
A contemporary survey is Dave Waltham’s “Testing Anthropic Selection: A Climate Change Example” (Astrobiology 11(2): 105–114, 2011).
But c’mon … what’s the probability that climatic instability might induce planetary ecological collapse? It’s kinda nutty even to consider it! 🙂
Richard Gill, the statistician who relentlessly criticized the shoddy case against Lucia de Berk (the nurse Aram mentions), has also written thoughtful papers about quantum foundations and Bell inequalities.
… and also one of the lucky few to have one of the Open Questions in Quantum Information (http://qig.itp.uni-hannover.de/qiproblems/Main_Page) resolved. Well, part A (of two parts):
http://qig.itp.uni-hannover.de/qiproblems/Bell_inequalities_holding_for_all_quantum_states
I meant part B.
There are 1146 characters in this post. They are all ASCII. The probability of this page is therefore (approximately) 1.4 times 10^(-2760). That seems highly unlikely. Therefore I do not believe in this post.
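(For the curious: the figure comes out right if each character is modeled as one of 256 equally likely byte values; whether that is the model intended above is my assumption. A quick sanity check:)

```python
import math

n_chars = 1146
# Assume each character is one of 256 equally likely byte values.
log10_p = -n_chars * math.log10(256)                 # ≈ -2759.84
exponent = math.floor(log10_p)
mantissa = 10 ** (log10_p - exponent)

print(f"p ≈ {mantissa:.1f} × 10^{exponent}")         # p ≈ 1.4 × 10^-2760
```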
It’s desirable (IMHO) to distinguish between mockery of bad methods (which is OK) and mockery of good questions (which is less obviously a good thing).
Concretely, the question asked is “What are the odds?”, and answers are sought that broadly illuminate any aspect of human existence in the Anthropocene.
So let us reflect, without much regard for comedic potential, on whether “What are the odds?” is a good question.
As has already been mentioned, Enrico Fermi was among the first to ask it; subsequently John von Neumann considered it in his 1955 essay “Can we survive technology?” (answer: maybe); and nowadays we have thoughtful essays like James Hansen’s “The runaway greenhouse effect” (per the YouTube video of that title).
It is notable that in subsequent decades, attempts to give serious answers to the Fermi/von Neumann/Hansen questions have engaged multiple themes of quantum physics. It is notable, too, that the Pontiff of Rome (for one among many) has directed his Pontifical Academy to consider this general class of questions and answers in depth.
Shall the Quantum Pontiffs follow the lead of Fermi/ von Neumann/ Hansen/ Benedict? What (if anything at all) does contemporary QIT have to say regarding “the odds” of present and future human existence?
If QIT is solely or mainly about mathematical analysis, then perhaps the Magic Eight Ball’s sobering answers are appropriate: • Reply hazy, try again • Ask again later • Better not tell you now • Cannot predict now • Concentrate and ask again • Don’t count on it • My reply is no • My sources say no • Outlook not so good • Very doubtful. 🙁
Yet if QIT is about more than proving theorems, perhaps the Magic Eight Ball’s optimistic answers apply: • It is certain • It is decidedly so • Without a doubt • Yes, definitely • You may rely on it • As I see it, yes • Most likely • Outlook good • Signs point to yes • Yes! 🙂
True: there is a serious question about how to take an honestly Bayesian approach to hard questions like “how risky is this mortgage-backed security?” or “is there intelligent life on other planets?”
Challenges include model uncertainty, and the dependence of answers on the chosen prior, combined with the difficulty of formalizing the process by which we agree on a prior. These are great questions for science/social science/epistemology, and it’s true, I don’t have any great insights into answering them. But I can at least point out how not to approach them.
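A toy illustration of how much the choice of prior can matter (the setup and numbers here are mine, purely for illustration): estimating a Bernoulli rate from a single observed success under two different Beta priors gives posterior means that disagree by roughly five orders of magnitude.

```python
def posterior_mean(a, b, successes, trials):
    """Posterior mean of a Bernoulli rate p under a Beta(a, b) prior,
    after observing `successes` successes in `trials` trials."""
    return (a + successes) / (a + b + trials)

# One observed "success" in one look (say, one known life-bearing planet):
uniform   = posterior_mean(1.0, 1.0, 1, 1)   # uniform prior  -> 2/3
skeptical = posterior_mean(1.0, 1e6, 1, 1)   # skeptical prior -> ~2e-6

print(uniform, skeptical)   # same data, answers ~5 orders of magnitude apart
```

Neither prior is "wrong" a priori, which is exactly the difficulty: with so little data, the data never get a chance to wash the prior out.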
That’s true, Aram. Golly, perhaps the empirical fact that the answer to Fermi’s question “Where is everyone?” is not known to be “Sleeping in our guest bedroom” can be construed as Bayesian evidence for $latex P \ne NP$.
The reasoning being that if $latex P = NP$, then galaxy-spanning, universe-filling starships would be *easy* to design, eh? And so our visitors would already be here! 🙂
In related news: patent for compressing random sources http://gailly.net/05533051.html
The best part is: “It is commonly stated that perfectly entropic data streams cannot be compressed. This misbelief is in part based on the sobering fact that for a large set of entropic data, calculating the number of possible bit pattern combinations is unfathomable. For example, if 100 ones and 100 zeros are randomly distributed in a block 200 bits long, there are
C(200, 100) = 9.055 × 10^58
combinations possible. The numbers are clearly unmanageable and hence the inception that perfectly entropic data streams cannot be compressed. The key to the present compression method under discussion is that it makes no attempt to deal with such large amounts of data and simply operates on smaller portions.” Brain. Explode.
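(For the record, the patent’s one piece of arithmetic does check out; with Python 3.8+’s `math.comb`:)

```python
import math

# Exact central binomial coefficient C(200, 100): the number of ways
# to place 100 ones among 200 bit positions.
c = math.comb(200, 100)
print(f"{c:.4e}")   # ≈ 9.05e+58, matching the patent's figure
```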
Oh, I like that: the shorter blocks have larger fluctuations, so a larger fraction of them have low-entropy empirical distributions. You don’t need big savings; just asymptotically compress 1 bit to 0.999999 bits (in expectation) and then you can iterate. Can anyone see the flaw? 🙂
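One way to see the flaw is pigeonhole counting: there are strictly fewer short codewords than blocks, so no injective (lossless) code can shorten even one n-bit block without lengthening another, and the "iterate tiny savings" scheme collapses. A few lines make the count explicit (block length 8 is an arbitrary choice):

```python
n = 8                         # block length in bits
n_blocks = 2 ** n             # 256 distinct n-bit blocks to encode

# Total number of strictly shorter bit strings available as codewords:
#   2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1 = 255
n_shorter = sum(2 ** k for k in range(n))

print(n_blocks, n_shorter)    # 256 255
# 255 < 256: any lossless code must give at least one block a codeword of
# length >= n, so averaged over uniformly random blocks the expected
# length can never drop below n bits -- not even by 0.000001 bits.
```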
This Quantum Pontiff column inspires reflection upon the computational difficulty of essay-writing. Essays upon the general theme of dumbness, e.g. “Here is an egregious mathematical misunderstanding that is instantiated in a dumb patent”, can be constructed efficiently via cherry-picking searches and lint-picking critical analyses; such essays are (computationally speaking) among the easiest to construct.
Essays upon the opposite theme “Here is a deep mathematical/physical insight that is instantiated in a foresighted patent” are tougher to construct (but why exactly?) and yet we feel instinctively that this class of topics too is interesting (but why exactly?).
Searching the intellectual property literature for historical names yields Norbert Wiener’s US#2024900, John von Neumann’s US#2815488, and Philip Anderson’s US#3335363. And there are more recent oddball patents like US#5307410 “Interferometric quantum cryptographic key distribution system” by some guy named Charles Bennett, and US#5768297 “Method for reducing decoherence in quantum computer memory” by Peter Shor.
There is even a pending application with the intriguing title “Physical realizations of a universal adiabatic quantum computer” (US#12/098,348) … presumably this application represents the nearest approximation that the present art allows to an enabling disclosure, and thus illuminates the pace of progress subsequent to the mid-90s patents of Bennett and Shor.
Bill GASARCH in his Computational Complexity weblog has taken to baldly predicting (in this week’s column for example) that “Quantum computers will never be practical.” A natural and important question for the Quantum Pontiff weblog to address is simply, is Bill right? 🙂
Stimulated in large measure by the ideas disclosed in Peter Shor’s 1998 patent “Method for reducing decoherence in quantum computer memory” (US#5768297), a panel of experts met in January 2002 to produce version 1.0 of the now-celebrated “Quantum Information Science and Technology Roadmap” (LA-UR-02-6900), whose concrete ten-year goal (page ES-1) called for testbeds to include, by 2012, the experimental demonstration of concatenated error-correcting codes on systems of 50 physical qubits.
There are only six weeks left on the 2002 QIST Roadmap, and so now is a natural time to review lessons learned from the past decade, 2002–2011, and perhaps begin conceiving roadmaps for the next decade, 2012–2022. To my mind, such reviews are most useful when they are mindful of the advice of historian John Toland.
If the Quantum Pontiffs don’t review the past decade of quantum computing, who will? In the event that quantum computing does not review its past, are its future prospects diminished? Is the Fortnow/GASARCH weblog articulating an emerging scientific consensus when it predicts “Quantum computers will never be practical”? Supposing that the “never practical” hypothesis is correct, then what should be the defining objectives and practical challenges of quantum information science?
These and similar questions are what I would like to see discussed here on The Quantum Pontiff.
As a concrete suggestion, perhaps during 2012 the Quantum Pontiffs might solicit guestposts on the general theme “Ten Years After QIST” from the eighteen members of the 2002 QIST Experts Panel: David Awschalom, Carlton Caves, Michael Chapman, Robert Clark, David Cory, Gary Doolen, David DiVincenzo, Artur Ekert, P. Chris Hammel, Richard Hughes, Paul Kwiat, Seth Lloyd, Gerard Milburn, Terry Orlando, Duncan Steel, Umesh Vazirani, K. Birgitta Whaley, and David Wineland.
If versions 1.0 and 2.0 of the QIST Roadmap (documents LA-UR-02-6900 and LA-UR-04-1778, respectively) were over-optimistic with respect to the narrow objective of error-corrected quantum computing, these same documents were if anything under-optimistic with respect to the striking advances of the past ten years: in our mathematical understanding of quantum coherence and decoherence, in our computational capabilities for simulating complex dynamical systems, and in practical advances in strategically critical technologies like dynamic polarization, sensing, and spectroscopy.
It would be great to see these themes discussed on The Quantum Pontiff, with a view toward inspiring similarly striking advances in the coming decade.
Folks, there’s a memorable scene in the movie Apollo 13 in which flight engineer Sy Liebergot tells flight chief Gene Kranz: “Gene, the Odyssey is dying.”
To be blunt, the Theoretical Physics StackExchange is similarly at risk: there are not enough questions, comments, and answers being posted to sustain a viable physics forum.
Is it really true that theoretical physics has too few questions, and/or too few people who care to ask and answer them, to sustain the Theoretical Physics StackExchange?
I’ve set myself the goal of asking/answering one or two physics questions a month; the point being that if every active researcher did this, the success of Theoretical Physics StackExchange would be assured … to the immense benefit (IMHO) of every active researcher.
My just-posted question for this month is “How does one geometrically quantize the Bloch equations?” If you’ve got answers to this or any other physics question, please post ’em.