Today is the final exam for the course I’ve been teaching this summer. So I need some reading material for when I’m not watching the students take their exam. Here are two fun ones I just downloaded (one via Alea):

arXiv:0803.3913:

The Reverse of The Law of Large Numbers

Authors: Kieran Kelly, Przemyslaw Repetowicz, Seosamh macReamoinn

Abstract: The Law of Large Numbers tells us that as the sample size (N) is increased, the sample mean converges on the population mean, provided that the latter exists. In this paper, we investigate the opposite effect: keeping the sample size fixed while increasing the number of outcomes (M) available to a discrete random variable. We establish sufficient conditions for the variance of the sample mean to increase monotonically with the number of outcomes, such that the sample mean “diverges” from the population mean, acting like a “reverse” of the law of large numbers. These results, we believe, are relevant to many situations which require sampling of statistics of certain finite discrete random variables.
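The basic effect is easy to see in the simplest case: for a uniform discrete random variable on {0, …, M−1}, the variance of the sample mean at fixed N is (M² − 1)/(12N), which grows with M. Here's a quick Monte Carlo sketch of that (just my own toy illustration of the uniform case, not the paper's general sufficient conditions; the function name is mine):

```python
import random
import statistics

def sample_mean_variance(M, N, trials=20000, seed=0):
    """Monte Carlo estimate of Var(sample mean) for a uniform discrete
    random variable on {0, ..., M-1}, with the sample size N held fixed."""
    rng = random.Random(seed)
    means = [sum(rng.randrange(M) for _ in range(N)) / N
             for _ in range(trials)]
    return statistics.pvariance(means)

# Holding N fixed and increasing the number of outcomes M:
# the variance of the sample mean grows, tracking (M**2 - 1) / (12 * N).
for M in (2, 10, 100):
    print(M, sample_mean_variance(M, N=5), (M**2 - 1) / (12 * 5))
```

So at fixed N, more outcomes means a noisier sample mean, which is the "reverse" direction the abstract is describing.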

and

arXiv:0806.0485

Complex and Unpredictable Cardano

Author: Artur Ekert

Abstract: This purely recreational paper is about one of the most colorful characters of the Italian Renaissance, Girolamo Cardano, and the discovery of two basic ingredients of quantum theory, probability and complex numbers. The paper is dedicated to Giuseppe Castagnoli on the occasion of his 65th birthday. Back in the early 1990s, Giuseppe instigated a series of meetings at Villa Gualino, in Torino, which brought together few scattered individuals interested in the physics of computation. By doing so he effectively created and consolidated a vibrant and friendly community of researchers devoted to quantum information science. Many thanks for that!

The Cardano one looks really good (and it’s written by Ekert). Cardano was an interesting guy, and many of the rules of probability he discovered were motivated by his penchant for gambling.

Ya know, that Kelly et al paper sounds vaguely relevant to the question “What happens to quantum tomography when you start increasing the dimension of the system?” There’s a sense in which the number of outcomes is increasing, but it would be more accurate to say that the number of dimensions for the parameter that you’re estimating is increasing.

Is there anything surprising & relevant to that topic in the paper? Or is what I’m mentioning too straightforward to have anything interesting said about it?

I mean, I could read the paper… but it’s funner to just ask…