Qubiting for Dollars

Rod Van Meter predicts that the first production quantum computer will cost forty bucks a qubit. He arrives at this number by assuming the first application will be factoring a 1,024-bit number, which requires about five kilobits (kiloqubits?), and multiplying in a factor of fifty for quantum error correction. That gives a total of a quarter of a million qubits, and he figures such a machine could cost around ten million dollars: hence the figure of forty bucks a qubit. It’s interesting to reason about this by first setting the cost instead of estimating the actual costs. The reason I like this approach is that if a comparable quantum computer turns out to cost ten times as much, then we simply won’t build it. Of course, I’m not sure how much the NSA would pay for a quantum computer: it may indeed be in the hundred-million-dollar range if it factors 1,024-bit numbers.
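For concreteness, here is a quick back-of-the-envelope sketch of Rod's arithmetic in Python. The qubit counts and the ten-million-dollar price tag are the round numbers assumed above, not measured figures:

```python
# Rod Van Meter's cost model, using the assumed round numbers from the post
logical_qubits = 5_000        # qubits needed to factor a 1,024-bit number
ec_overhead    = 50           # multiplier for quantum error correction
machine_price  = 10_000_000   # assumed total price tag, in dollars

physical_qubits = logical_qubits * ec_overhead   # 250,000 qubits
cost_per_qubit  = machine_price / physical_qubits

print(f"{physical_qubits:,} physical qubits -> ${cost_per_qubit:.0f} per qubit")
# 250,000 physical qubits -> $40 per qubit
```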
Of course, I don’t believe in quantum error correction…or rather, I don’t believe we will use the standard approach to quantum error correction to obtain a fault-tolerant quantum computer 😉 . Thus I’m betting the first quantum computer will cost around a dollar a qubit (although I certainly think some of these “qubits” may in fact be made up of anywhere from hundreds to thousands to millions to 10^19 individual quantum systems).
It will be interesting to see how far off Rod and I are at date Q. I suspect Rod will be closer, mostly because he’s actually worked in the real world.
Update: For comparison, the ENIAC cost about half a million dollars in 1945, which is about five million dollars in today’s money. The total number of vacuum tubes in that monster was around 20,000. That’s about 250 of today’s dollars per vacuum tube. And, of course, today I can buy a computer with 50 million (make that more than a billion) transistors for under five hundred bucks!
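Running the same per-component division on the ENIAC figures above, and on a rough billion-transistor count for a five-hundred-dollar machine (an assumption, not a spec sheet):

```python
# Cost per switching element, then and now (figures from the update above;
# the billion-transistor count for a $500 machine is a rough assumption)
eniac_price_today = 5_000_000   # ENIAC's ~$500K in 1945, inflation-adjusted
eniac_tubes       = 20_000

modern_price       = 500
modern_transistors = 1_000_000_000

print(f"ENIAC: ${eniac_price_today / eniac_tubes:.0f} per vacuum tube")            # $250
print(f"Today: ${modern_price / modern_transistors * 1e6:.2f} per million transistors")  # $0.50
```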

4 Replies to “Qubiting for Dollars”

  1. Dave, the only computer you can buy these days that has 50M transistors is a PDA or a cell phone. Low-end CPUs might be less than 50M, but don’t forget DRAM, at a transistor per bit. The $1300 laptop I’m typing this on has more than six billion transistors in it. Yale Patt, one of the premier architecture researchers, was giving talks a decade ago on what to do with a billion transistors on a CPU chip. We’re practically there…

  2. I think you are way off, Dave. 200 grand is super cheap. That’s less than the annual operating budget of an average optics lab (and a lot less if you have to pay for your grad students).
    Even 10 million looks relatively cheap, even without error correction. Again, using optics as an example, check out the price of parametric down-converters, single-photon detectors, etc… each device is on the order of tens of thousands of dollars. If you were going to make a factoring machine with these devices, you would need a lot of them. It’s going to get expensive, fast.
    It’s true, though, that these prices aren’t static, and optics isn’t a great example. However, I’m guessing any device that doesn’t require error correction won’t be made out of cheap materials.
    I’m also not even going to start talking about labour costs…
    Maybe I’m a pessimist…
