What To Do When There *Is* Nothing Else?

Michael Green’s appointment to replace Stephen Hawking as the Lucasian chair has, quite predictably, brought back into the spotlight the ever-simmering STRING WARS!!!OMG!!!STRINGTHEORYRLZ!!. Okay, maybe not the spotlight, per se, but I did find the article about Green in the Guardian interesting (via the so-wrong-it-hurts fellow):

But that was one of their arguments, that the academy is so biased towards string theory – hiring mostly string theorists, crowning mostly string theorists – that it has driven out all other ways of seeing (Smolin compared it to deciding that there was only one way to fight cancer, and pouring all available resources into that one way). “People do what they feel is going to be productive,” says Green. “It’s all very well to say they should be doing something else. But there is nothing else.”

Now, of course, this is all part of a long series of arguments about the validity of string theory as an approach to a physical theory merging gravity and the standard model. Yawn, that is *so* 00s.
What it did make me think about, however, was what the equivalent argument would be in a different field. And while I possess my fair share of extralusionary intelligence, I figured I’d better stick to my own field for this exercise. So what would the equivalent be in quantum computing?
I hereby declare that there are only two valid approaches to building a quantum computer: ion trap quantum computers and superconductor-based quantum computers. It’s all very well to say that we should be spending our time working on other “ideas” for quantum computers. But there is nothing else.

ArXiv in the Cloud Coming?

Via the arXiv api newsgroup comes the rumor that the full arXiv will, sometime in the future, be available for bulk download:

For a full copy of (or particular subsets of) the PDFs for arXiv papers, we are in the process of setting up a service in the Cloud, which will offer the option for bulk download. I’ll let you know when that becomes available.

Cool! All of physics since recorded arXiv time on my hard drive 🙂
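
Until that cloud service materializes, the existing arXiv API already lets you script smaller harvests of metadata and abstracts. A minimal sketch using only the Python standard library (the search query here is just an illustrative example):

```python
# Query the public arXiv API (http://export.arxiv.org/api/query) for a handful
# of papers; results come back as an Atom XML feed.
import urllib.request

url = ("http://export.arxiv.org/api/query"
       "?search_query=all:%22quantum+computing%22&start=0&max_results=5")
with urllib.request.urlopen(url) as response:
    atom_feed = response.read().decode("utf-8")

print(atom_feed[:500])  # peek at the feed header and first entry
```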

Quantum LSD

Oh man, sometimes even I, a staunch Caltech grad, wish I could be at MIT. The MIT QIP seminar this next Monday looks…intriguing (Monday 10/26 at 4:00 in 36-428. Silly MITers and their numbered buildings, so cold):

David Kaiser (MIT)
How the Hippies Saved Physics
Abstract:
In recent years, the field of quantum information science (an amalgam of topics ranging from quantum encryption, to quantum computing, quantum teleportation, and more) has catapulted to the cutting edge of physics, sporting a multi-billion-dollar research program, tens of thousands of published research articles, and a variety of device prototypes. This tremendous excitement marks the tail end of a long-simmering Cinderella story. Long before the big budgets and dedicated teams, the field smoldered on the scientific sidelines. In fact, the field’s recent breakthroughs derive, in part, from the hazy, bong-filled excesses of the 1970s New Age movement. Many of the ideas that now occupy the core of quantum information science once found their home amid an anything-goes counterculture frenzy, a mishmash of spoon-bending psychics, Eastern mysticism, LSD trips, CIA spooks chasing mind-reading dreams, and comparable “Age of Aquarius” enthusiasms. For the better part of two decades, the concepts that would, in time, blossom into developments like quantum encryption were bandied about in late-night bull sessions and hawked by proponents of a burgeoning self-help movement (more snake oil than stock option). This talk describes the field’s bumpy transition from New Age to cutting edge.

I knew that the hippies drove the computer revolution but did not know that they were also responsible for quantum information science 🙂

Bayesians Say the Cutest Things

The Dutch book argument of Bruno de Finetti is an argument meant to justify subjective probabilities. What one does in this argument is give probabilities an operational definition in terms of the amount one is willing to bet on some event. Thus a probability p is mapped to your being willing to make a bet on the event at 1-p to p odds. In the Dutch book argument one shows that if one takes this operational meaning, and in addition allows the person you are betting against to take either side of the bet, then failing to follow the axiomatic laws of probability lets that person construct a Dutch book: a set of bets on which the person you are betting against always wins, no matter how the events turn out. For the best explanation and derivation of this result that I know, consult the notes written by Carl Caves: Probabilities as betting odds and the Dutch book.
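To see the machinery in action, here is a toy version of the construction (my own minimal sketch, not taken from Caves’ notes): an agent whose probabilities for an event and its complement don’t sum to 1 can be booked into a sure loss.

```python
# Operational rule: stating probability p for an event means you will buy OR
# sell, at price $p, a ticket that pays $1 if the event occurs.

def agent_net(p_event, p_complement):
    """Agent's net outcome when a bookie, free to choose either side, exploits
    prices on an event and its complement that violate the sum rule."""
    total = p_event + p_complement
    if total > 1:
        # Bookie sells the agent both tickets; exactly one pays $1.
        return 1 - total      # same (negative) net in every possible world
    elif total < 1:
        # Bookie buys both tickets from the agent; exactly one costs the agent $1.
        return total - 1      # again negative no matter what happens
    return 0                  # coherent prices: no sure loss to exploit

print(agent_net(0.6, 0.5))  # -0.1: a guaranteed loss, rain or shine
print(agent_net(0.5, 0.5))  # 0: obeying the axioms closes the book
```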
Now I have many issues with the Dutch book argument, the first and foremost being that it is a ridiculous setup. I mean, how often do you place a bet in which you are willing to take either side of the bet (buy and sell)? “Yes, I would like to either buy or sell a lottery ticket, please?” Sure, you can do it, but there are many reasons why money has value outside of the single bet being placed, and therefore buying (giving someone your money and getting paid back if you win the bet) versus selling (receiving money and then having to pay off the bet if you lose) are not symmetric in any world where the unit being exchanged has a temporal value and the bet is placed before the event is resolved. I am, indeed, a one-sided Bayesian. I will leave it up to the reader to construct the axioms of probability by which I work.
Amusingly, at least to me, this objection does not seem to be raised much in the literature on the Dutch book argument. But the other day I found a great quote relevant to this objection which I just have to share. It is from Artificial Intelligence: A Modern Approach by Russell and Norvig. In the book they discuss, but don’t prove, de Finetti’s argument. Then they say:

One might think that this betting game is rather contrived. For example, what if one refuses to bet? Does that end the argument? The answer is that the betting game is an abstract model for the decision-making situation in which every agent is unavoidably involved at every moment. Every action (including inaction) is a kind of bet, and every outcome can be seen as a payoff of the bet. Refusing to bet is like refusing to allow time to pass.

You heard it here first, people: if you want to stop time, all you have to do is not bet! Crap, I have homework due tomorrow, what should I do? Well, certainly you should not bet, because we all know that refusing to bet is refusing to allow time to pass. ROFL. Bayesians are so cute when they try to justify themselves.

Chairs

Two notes on chairs. Michael Green is the new Lucasian Chair of Mathematics, replacing the esteemed Stephen Hawking. Green helped spark the great optimism in string theory by discovering, with John Schwarz, the Green-Schwarz anomaly cancellation mechanism.
Elsewhere, the Perimeter Institute has named ten new distinguished research chairs, among them a host of the quantum computing afflicted:

Dorit Aharonov is a Professor in the Department of Computer Science and Engineering at Hebrew University in Jerusalem. She has made major contributions to the theoretical foundations of quantum computation, in particular in the context of understanding and counteracting the effects of ‘noisy’ environments on delicate quantum systems performing computations, the identification of a quantum to classical phase transition in fault tolerant quantum computers, the development of new tools and approaches for the design of quantum algorithms, and the study of ground states of many body quantum Hamiltonians for various classes of Hamiltonians, from a computational complexity point of view. In 2006 she was awarded the Krill prize for excellence in scientific research. Dr. Aharonov is on the faculty of Perimeter Scholars International.
Patrick Hayden holds the Canada Research Chair in the Physics of Information at McGill University. His research focuses on finding efficient methods for performing the communication tasks that will be required for large-scale quantum information processing. This includes the development of methods for reliably sending quantum states through ‘noisy’ media and for protecting quantum information from unauthorized manipulation. He has also applied these techniques to the question of information loss from black holes. Among Dr. Hayden’s honors, he is a past Alfred P. Sloan Foundation Fellow and Rhodes Scholar.
Christopher Isham is a Senior Research Investigator and Emeritus Professor of Theoretical Physics at Imperial College London. He is a former Senior Dean of the College. Dr Isham has made many important contributions in the fields of quantum gravity and the foundations of quantum mechanics. Motivated by the ‘problem of time’ in quantum gravity, he developed a new approach to quantum theory known as the ‘HPO formalism’ that enables the theory to be extended to situations where there is no normal notion of time (such as in Einstein’s theory of general relativity). Since the late 1990s, Dr. Isham has been developing a completely new approach to formulating theories of physics based on the mathematical concept of a ‘topos’. This gives a radically new way of understanding the traditional problems of quantum theory as well as providing a framework in which to develop new theories that would not have been conceived using standard mathematics. From 2001-2005, Dr. Isham was a member of Perimeter Institute’s Scientific Advisory Committee; during the last year he was the Chair of the Committee.
Leo Kadanoff is a theoretical physicist and applied mathematician based at the James Franck Institute at the University of Chicago. He is considered a pioneer of complexity theory, and has made important contributions to research in the properties of matter, the development of urban areas, statistical models of physical systems, and the development of chaos in simple mechanical and fluid systems. He is best known for the development of the concepts of “scale invariance” and “universality” as they are applied to phase transitions. More recently, he has been involved in the understanding of singularities in fluid flow. Among Dr. Kadanoff’s many honours, he is a past recipient of the National Medal of Science (US), the Grande Medaille d’Or of the Académie des Sciences de l’Institut de France, the Wolf Foundation Prize, the Boltzmann Medal of the International Union of Pure and Applied Physics, and the Centennial Medal of Harvard University. He is also a past President of the American Physical Society. Dr. Kadanoff is on the faculty of Perimeter Scholars International.
Renate Loll is a Professor of Theoretical Physics and a member of the Institute for Theoretical Physics in the Faculty of Physics and Astronomy at Utrecht University. Her research centers on quantum gravity, the search for a consistent theory that describes the microscopic constituents of spacetime geometry and the quantum-dynamical laws governing their interaction. She has made major contributions to loop quantum gravity, and with her collaborators, has proposed a novel theory of Quantum Gravity via ‘Causal Dynamical Triangulations.’ Dr. Loll heads one of the largest research groups on nonperturbative quantum gravity worldwide, and is the recipient of a prestigious personal VICI-grant of the Netherlands Organization for Scientific Research. She is also a faculty member of Perimeter Scholars International.
Malcolm Perry is a Professor of Theoretical Physics in the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge and a Fellow of Trinity College, Cambridge. His research centers upon general relativity, supergravity and string theory. Dr. Perry has made major contributions to string theory, Euclidean quantum gravity, and our understanding of black hole radiation. With Perimeter Institute Faculty member Robert Myers, he developed the Myers-Perry metric, which shows how to construct black holes in the higher spacetime dimensions associated with string theory. Dr. Perry’s honours include an Sc.D. from the University of Cambridge. Dr. Perry is also on the faculty of Perimeter Scholars International.
Sandu Popescu is a Professor of Physics at the H. H. Wills Physics Laboratory at the University of Bristol, and a member of the Bristol Quantum Information and Computation Group. He has made numerous contributions to quantum theory, ranging from the very fundamental, to the design of practical experiments (such as the first teleportation experiment), to patentable commercial applications. His investigations into the nature of quantum behavior, with particular focus on quantum non-locality, led him to discover some of the central concepts in the emerging area of quantum information and computation. He is a past recipient of the Adams Prize (Cambridge), and the Clifford Patterson Prize of the Royal Society (UK).
William Unruh is a Professor of Physics at the University of British Columbia who has made seminal contributions to our understanding of gravity, black holes, cosmology, quantum fields in curved spaces, and the foundations of quantum mechanics, including the discovery of the Unruh effect. His investigations into the effects of quantum mechanics on the earliest stages of the universe have yielded many insights, including the effects of quantum mechanics on computation. Dr. Unruh was the first Director of the Cosmology and Gravity Program at the Canadian Institute for Advanced Research (1985-1996). His many awards include the Rutherford Medal of the Royal Society of Canada (1982), the Herzberg Medal of the Canadian Association of Physicists (1983), the Steacie Prize from the National Research Council (1984), the Canadian Association of Physicists Medal of Achievement (1995), and the Canada Council Killam Prize. He is an elected Fellow of the Royal Society of Canada, a Fellow of the American Physical Society, a Fellow of the Royal Society of London, and a Foreign Honorary Member of the American Academy of Arts and Sciences.
Guifre Vidal is a Professor in the School of Physical Sciences at the University of Queensland, who has made important contributions to the development of quantum information science, with applications to condensed matter theory. His research explores the phenomenon of entanglement, the renormalization group, and the development of tensor network algorithms to simulate quantum systems. Dr. Vidal’s past honors include a Marie Curie Fellowship, awarded by the European Union, and a Sherman Fairchild Foundation Fellowship. He is a Federation Fellow of the Australian Research Council.
Mark Wise is the John A. McCone Professor of High Energy Physics at the California Institute of Technology. He has conducted research in elementary particle physics and cosmology, and shared the 2001 Sakurai Prize for Theoretical Particle Physics for the development of the ‘Heavy Quark Effective Theory’ (HQET), a mathematical formalism that enables physicists to make predictions about otherwise intractable problems in the theory of the strong interactions of quarks. He has also published work on mathematical models for finance and risk assessment. Dr. Wise is a past Sloan Foundation fellow, a fellow of the American Physical Society, and a member of the American Academy of Arts and Sciences and of the National Academy of Sciences.

Grape Crush Time

The grapes have been picked up and the fermenting has (hopefully!) begun. This year I’m trying two types of grapes, Cabernet Sauvignon and Sangiovese.

Machine Learning Ruins Blackjack

Blackjack, or 21, is a game that many enjoy wasting their money playing at casinos. For those who don’t like to waste their money, or at least want to waste it more slowly than others, card counting is a time-honored tradition for moving the odds away from the casino and in the player’s direction (blessed be Ed Thorp). In other words, it makes the game at least slightly enjoyable for those who like to win. But now a graduate of the University of Dundee, Kris Zutis, is going to ruin this small smidgen of fun:

A University of Dundee graduate has created a computer system with the potential to make the game of Blackjack fairer by detecting card counters and dealer errors.

Okay, so catching dealer errors certainly makes the game “more fair.” But detecting card counters? People who eke out a minor advantage by using their damn brains (and who have to work to avoid detection, since casinos can kick them out, not because card counting is cheating per se, but because the casinos run the game) are not acting fair? To be fair, of course, casinos are already doing this, so we should be nice to the grad student 🙂
And further, of course, all is fair in love, war, and casino games. But this makes me wonder about arbitrage in the era of machine learning, each machine vying to outdo the other in keeping its profits locked up tight. My high-margin classifier just gave me 21, yippee! Oh wait, this is already happening on Wall Street. Remind me again about the market-making and liquidity arguments for blackjack.
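
For the record, the damn-brains advantage the machines are hunting can be as simple as the classic Hi-Lo running count. A minimal sketch of that standard scheme (nothing to do with Zutis’s detection system):

```python
# Hi-Lo card counting: 2-6 count +1, 7-9 count 0, tens and aces count -1.
# A positive running count means the remaining shoe is rich in high cards,
# which shifts the odds toward the player.
HI_LO = {**{r: +1 for r in ["2", "3", "4", "5", "6"]},
         **{r: 0 for r in ["7", "8", "9"]},
         **{r: -1 for r in ["10", "J", "Q", "K", "A"]}}

def running_count(cards_seen):
    """Sum the Hi-Lo values of every card dealt so far."""
    return sum(HI_LO[card] for card in cards_seen)

print(running_count(["2", "5", "K", "3", "A", "10"]))  # 0: a neutral shoe so far
```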