Lightning Rods Against God's Fury

I recently picked up Discarded Science: Ideas that seemed good at the time… by John Grant, a delightful little book about ideas in science that just didn’t pan out (hmmm….how many of my papers will be relegated to that dustbin? Ack, down that pathway lies depression.) When I was but a wee lad I spent many hours reading books about the Loch Ness monster, UFOs, and Bigfoot. (Not only did I learn to debug before I learned to program, apparently I learned pseudoscience before I learned science. Hmmm.) My experience with these early pseudoscience books has led me to think that an important part of understanding science is understanding what is not science. Plus I am a real sucker for the kind of half-baked arguments that are a staple of those pseudoscientific books. My favorite passage from the book so far is a classic illustration:

…In November 1755 the most destructive earthquake ever to strike the northeastern US hit at Cape Ann, some 50km south of Boston. The Reverend Thomas Prince, of South Church, Boston, knew at once who was to blame: Benjamin Franklin (1706-1790), for having invented the lightning conductor. Before Franklin’s scheme of putting pointed metal rods on tall buildings had been universally adopted, God had been able to express His wrath by blasting something with lightning. Now that the presumptuous Franklin had taken that option away from Him, He was having to use earthquakes instead.

Which leads, of course, to the question of just how many of God's tools Science has destroyed.

Debug First?

After Scott confessed to still programming in BASIC, I had a good time recalling how I first learned to program. My first interaction with a computer and “programming” was through LOGO, that cute little language for programming a little turtle to draw graphics. Drawing cool and crazy pictures was fun! But I don’t really remember learning that as “programming” so much as learning geometry (I was in the second grade at the time), and I certainly recall sharing the computer (probably not very sharefully.) But the real way I learned to program was not through LOGO but through debugging!
In the (g)olden days of my halcyon youth you could buy computer magazines which had, in the back, lines of BASIC to enter into your computer in order to play a game. Now I was all about playing games, and when we obtained a TRS-80 Color Computer, my parents only bought one game cartridge, so in order to play more games we had to type them into the computer. A group of neighborhood kids would get together and do this. One person would read, another would type, and others would check to make sure what was being typed was correct. Even with these precautions, however, we would, invariably, make a mistake. So when we made a mistake we had to go looking through the program to find out where the error was. At first, we’d just go through the whole program line by line. But as we got more sophisticated we began to understand that when the game died while doing X, the program was probably at Y when it happened, and so we began to understand what all that funny stuff we were typing in actually meant. Eventually we got very good at debugging the code we typed into the computer. Indeed, in a very real sense, I learned to debug before I learned to code! (Years later I can still recall the astonishment of a high school teacher who watched me debug a program in very short order 🙂 )
So here is a question. Sure, we could teach students to program using your favorite language FAVLANG, but what if it were possible for us to teach debugging before we taught programming? What if the structure of the introduction to programming were centered around finding and fixing bugs? I mean, after reading Dreaming in Code, where the book ends with thousands of bugs left unfixed, I can only imagine that good debugging skills are important. Maybe even as important as coding skills? Just a thought. But even if this is not the way to go, how could you teach debugging before you taught programming? Now that’s fun to think about.
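Just to make the thought concrete, here is a sketch of what one debug-first exercise might look like (an invented illustration, not taken from any actual curriculum): hand the student a short, complete program with a single planted bug and a failing check, and make finding and fixing the bug the entire assignment.

# A hypothetical "debug first" exercise (an invented illustration, not from
# any real curriculum): the student gets a short, complete program with one
# planted bug and a failing check, and the only job is to find and fix it.

def average(scores):
    """Return the mean of a non-empty list of scores."""
    total = 0
    for s in scores:
        total += s
    return total / (len(scores) - 1)   # planted bug: should divide by len(scores)

# The check the student starts from; it reports the discrepancy until the bug is fixed.
result = average([2, 4, 6])
print(f"average([2, 4, 6]) = {result}, expected 4")   # the buggy version prints 6.0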

"Dreaming in Code" by Scott Rosenberg

Many of you have probably read the classic The Soul of a New Machine by Tracy Kidder, which chronicles the building of a minicomputer at Data General in the late seventies. Earlier this week the torrent finally let up (no it didn’t stop raining) and spring break descended, so I made a trip to the bookstore. There I found a book proclaiming on the back jacket to be the first true successor to The Soul of a New Machine: Dreaming in Code by the cofounder of Salon.com, Scott Rosenberg. Dreaming in Code chronicles the trials and tribulations of the creation of Chandler, an “interpersonal information manager that adapts to your changing needs”, which, if you visit the website linked above, you will discover is still at version 0.7. The book is an interesting, quick, if somewhat depressing read, I must say. Having myself never been involved in software development beyond my own hacking around with little personal programs, I found the picture painted of how code is developed to be interesting. Indeed, I can see in my own methods for programming many of the traits which are described as impeding good software development in a team setting: the desire to do it all yourself, the desire to reinvent instead of reuse, the desire to overdesign, etc., etc. If nothing else, this book is probably a great read for anyone on the path towards becoming a software developer: not really a warning so much as a case study of the trials and tribulations of dreaming in code.

PRINT *, 'RIP, John Backus'

John Backus, who led the team that invented FORTRAN, has passed away. In a testimony to the staying power of FORTRAN, when I was doing an undergraduate research project in astrophysics in 1995, most of the code I dealt with was written in FORTRAN.

The Great Wedding Diet of 2007

(Note: the plot below is indicative of a method for losing weight which is not exactly healthy and is not recommended!) Results of the great wedding diet of 2007:
[Plot: Great Wedding Diet of 07]
The method? Little food and running four to five miles every morning. Funny how that works.
Okay, for old times’ sake, here is, for comparison, my last diet:
[Plot: The Diet of 03-04]

Quantum versus Classical: Exponent SMACKDOWN!

The world is quantum mechanical, damnit, and so (the party line goes) it shouldn’t be surprising if we find that a theory of computation based on quantum theory is more elegant than one based on machines whose concept of reality is so restrictive as to not allow cats both alive and dead. Okay, so we can say this half in jest, half seriously, but does it really hold any water? In order to settle this question of aesthetics (joke about mathematics deleted), I propose that we hold a Quantum versus Classical beauty contest. Now since quantum computers are at least as powerful as classical computers, a raw runtime comparison would be rigged from the start, so we can’t just say that an algorithm is better because it has a better runtime or better use of space, etc. Instead we need to use our artistic taste, i.e. pretend we have a clue about what it means for a result to be elegant (and yes, that was a dig at a certain popular science book 😉 )
Round 1
So I propose we perform this artistic measuring with a quantum versus classical exponent SMACKDOWN! (the exclamation mark is a part of the word.) Consider, for example, the recent quantum algorithm for evaluating a NAND tree (here and here, with an explanation by the traveling complexity theory salesman here. Also the awesome NAND formula paper here). This quantum algorithm has a running time of [tex]$O(N^{1/2 + \epsilon})$[/tex]. Now compare this with the best (and optimal) classical algorithm for this problem. The result? [tex]$O(N^{0.753\dots})$[/tex] where [tex]$0.753\dots \approx \log_2(1+\sqrt{33})-2$[/tex]. Bleh. Round One of the Quantum versus Classical Exponent SMACKDOWN! most certainly goes to the quantum world. One half is certainly better than something which has both a logarithm and a square root of thirty-three!
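For a feel of where that ugly classical exponent actually comes from, here is a minimal sketch (my own toy Python, not code from any of the papers linked above) of the randomized short-circuit evaluation that achieves it on a balanced binary NAND tree: recurse into a randomly chosen child first, and never look at the other child when the first one comes back 0.

import random

def eval_nand_tree(leaves):
    """Evaluate a complete binary NAND tree over a list of leaf bits using the
    classic randomized short-circuit strategy: recurse into a randomly chosen
    child first; if it returns 0 the NAND is 1 and the other child is never
    read. On a balanced tree the expected number of leaves read scales as
    N^(0.753...), the optimal classical query count."""
    n = len(leaves)
    if n == 1:
        return leaves[0]
    half = n // 2
    left, right = leaves[:half], leaves[half:]
    first, second = (left, right) if random.random() < 0.5 else (right, left)
    if eval_nand_tree(first) == 0:
        return 1                          # NAND(0, anything) = 1: short circuit
    return 1 - eval_nand_tree(second)     # NAND(1, x) = NOT x

# Example: a depth-3 tree over 8 random leaf bits.
leaves = [random.randint(0, 1) for _ in range(8)]
print(leaves, "->", eval_nand_tree(leaves))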
Round 2
But ha, say the classical complexity theorists. What about Grover’s problem? Sure, in this unstructured query search the quantum world achieves a speedup from [tex]$O(N)$[/tex] classically to [tex]$O(N^{1/2})$[/tex] quantum mechanically, but look at your exponent. One half? Who the hell ordered one half? I mean, if you had gotten log of N or even constant, then you would have something to brag about. But a square root speedup? Who ordered that? Round two of the Quantum versus Classical Exponent SMACKDOWN! most certainly goes to the classical world, where weird square root speedups are not ubiquitous for straightforward unstructured query searches.
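To put the “who ordered one half” grumbling into actual numbers, here is a quick back-of-the-envelope comparison using nothing but the standard textbook counts: roughly N/2 expected queries for a classical scan versus roughly (π/4)√N Grover iterations when there is a single marked item.

import math

# Back-of-the-envelope query counts for unstructured search with a single
# marked item out of N: a classical scan needs about N/2 queries on average
# (N in the worst case), while Grover's algorithm needs roughly (pi/4)*sqrt(N)
# iterations. Just arithmetic on the standard formulas, nothing fancier.
for n in (10**3, 10**6, 10**9):
    classical = n / 2
    grover = (math.pi / 4) * math.sqrt(n)
    print(f"N = {n:>10}: ~{classical:.0f} classical queries vs ~{grover:.0f} Grover iterations")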
Round 3
To be continued!

The Physics of "All-In"?

Combining two excellent topics, physics and poker, physics/0703122:

Universal statistical properties of poker tournaments
Authors: Clément Sire
We present a simple model of Texas hold’em poker tournaments which contains the two main aspects of the game: i. the minimal bet is the blind, which grows exponentially with time; ii. players have a finite probability to go “all-in”, hence betting all their chips. The distribution of the number of chips of players not yet eliminated (measured in units of its average) is found to be independent of time during most of the tournament, and reproduces accurately Internet poker tournaments data. This model makes the connection between poker tournaments and the persistence problem widely studied in physics, as well as some recent physical models of biological evolution or competing agents, and extreme value statistics which arises in many physical contexts.
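For the curious, here is a toy simulation in the spirit of the two ingredients the abstract names, an exponentially growing blind and a finite probability of going all-in; everything else in it (how players are paired, how showdowns are resolved, the particular parameter values) is my own guess for illustration and not the paper's actual model.

import random

def tournament(n_players=100, chips=100.0, blind0=1.0, growth=1.001, p_allin=0.1):
    """Toy tournament with the abstract's two ingredients: a blind that grows
    exponentially with time and a finite probability that a hand becomes an
    all-in. The pairing, the coin-flip showdowns, and the parameter values
    are illustrative guesses, not the paper's actual model."""
    stacks = [chips] * n_players
    blind = blind0
    hands = 0
    while sum(s > 0 for s in stacks) > 1:
        alive = [i for i, s in enumerate(stacks) if s > 0]
        a, b = random.sample(alive, 2)              # two surviving players contest the hand
        if random.random() < p_allin:
            bet = min(stacks[a], stacks[b])         # all-in: the smaller stack is at risk
        else:
            bet = min(blind, stacks[a], stacks[b])  # otherwise just the blind
        winner, loser = (a, b) if random.random() < 0.5 else (b, a)
        stacks[winner] += bet
        stacks[loser] -= bet
        blind *= growth                             # the blind grows exponentially with time
        hands += 1
    return hands

print("toy tournament lasted", tournament(), "hands")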

The Computer Recommends….

Scirate.com now has a simple recommending engine. Right now it’s very basic, but I hope to improve it and make it a little more interesting in the future.

Skepticism….Check. Axes….Ummm…

Bah (posted without the long line of four-letter words I would really like to print but am forced, by my good nature and good upbringing, to avoid printing on this family-friendly blog):

“Businesses aren’t too fascinated about the details of quantum mechanics, but academics have their own axes to grind. I can assure you that our VCs look at us a lot closer than the government looks at the academics who win research grants,” Martin said.

Note to D-Wave: We aren’t skeptical that you built a device. We are skeptical that your path forward will ever work (some more skeptical than others…me, I’m an optimist!) and we are even more skeptical of your statements trying to sell quantum devices by advertising unsubstantiated computational power. I also know VCs who looked closely at your company and said something very similar to what those lazy, no-good bum academics are saying.