Acronyms Beyond NISQ

NISQ is a term coined by John Preskill¹ circa 2018 and stands for “Noisy Intermediate-Scale Quantum”. The term aims to describe quantum computers that are not just toy few-qubit systems, but systems of a slightly larger scale. This “slightly larger” is a bit hard to define, but roughly most people take it as what is achievable with a quantum computer that does not use error correction. Or in other words, the “intermediate” means roughly “what you can do with the natural fidelities of your qubits” (with a fudge factor for those who want to hold their nose and use error mitigation).

Now this is a blog, so I will give my opinion. And that is that the word intermediate in NISQ drives me nuts. In part because it is vague (intermediate between what?), but more because the word itself is a disaster. Intermediate comes to us from Latin, being a combination of inter, meaning “between”, and medius, meaning “middle”. But this is silly: how can there be a middle without being between? It’s like saying “middle middle”. Whenever I hear NISQ I am reminded of this bastard doubling, and I start working on a time machine to go back and make etymological corrections (good short story idea: a society of time travelers whose sole goal is to bring reason to the etymological record).

A more interesting question than my own personal hangups about word origins is what we should call what exists on the other side of intermediate. My friend Simone Severini has used the term LISQ, which stands for “Logical Intermediate-Scale Quantum”². The idea, as I understand it, is to use this term for the era in which we start to construct the first error-corrected quantum devices. In particular, it is the point at which, instead of using raw physical qubits, one uses some logical encoding to build the basic components of the quantum computer. (<high horse>Of course, all qubits are encoded; there is always a physical Hamiltonian with a much larger Hilbert space at work. What we call a qubit subsystem is a good approximation, but it is always an approximation.</high horse>) I am excited that we are indeed seeing the ideas of quantum error correction being used, but I think this obscures that what is important is not that a qubit uses error correction, but how well it does so.

I want to propose a different acronym. Of course, I will avoid the use of that annoying term intermediate. But more importantly, I think we should use a term that is more quantitative. In that vein I propose, in fine rationalist tradition, that we use the metric system! The quantity that matters most for a quantum computer is really the number of quantum gates or quantum instructions that one can execute before things fall apart (due to effects like decoherence, coherent imprecision, a neutral atom falling out of its trapping potential, or a cataclysmic cosmic ray shower). Today’s best-performing quantum computations have gotten signal out of their machines while reaching somewhere near the 1000 gate/instruction level³. Converting this to a metric prefix, we get the fine “Kilo-Instruction Scale Quantum”. Today’s era is not the NISQ era, but the KISQ era.

And as we start to move up the scale by using error correction (or somehow finding natural qubits with incredible raw fidelities), we enter the regime where, instead of being able to run a thousand instructions, we start to be able to run a million. This will be the “Mega-Instruction Scale Quantum” era, or MISQ era. And I mean, how cool will that be? Who doesn’t love to say the word Mega? (Just don’t drawl your “e” or you might stumble into politics.) Then we can continue on in this vein:

  • 10³ instructions = KISQ (kilo) = NISQ
  • 10⁶ instructions = MISQ (mega)
  • 10⁹ instructions = GISQ (giga)
  • 10¹² instructions = TISQ (tera) <– Shor’s algorithm lives around here
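The mapping from instruction count to era label is just the metric prefix of the count. A minimal sketch (the `era` helper and its threshold table are my own illustration, not anything standardized):

```python
# Hypothetical helper: map an executable instruction count to the
# proposed era acronym, mirroring the metric prefixes above.
ERAS = [
    (10**3, "KISQ"),   # kilo
    (10**6, "MISQ"),   # mega
    (10**9, "GISQ"),   # giga
    (10**12, "TISQ"),  # tera -- Shor's algorithm lives around here
]

def era(instructions: int) -> str:
    """Return the era acronym for a given instruction count."""
    label = "pre-KISQ"
    # Pick the largest prefix whose threshold has been reached.
    for threshold, name in ERAS:
        if instructions >= threshold:
            label = name
    return label

print(era(1_000))    # today's ~1000-instruction machines -> KISQ
print(era(10**12))   # -> TISQ
```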

An objection to this approach is that I’ve replaced the word intermediate with the word instruction, and while we gain the removal of the “middle middle”, we now have the vague term instruction. The word origin of instruction is a topic for another day, but roughly it is a combination of “in” and “to pile up”, so I would argue it doesn’t have as silly an etymology as intermediate. But more to the point, an “instruction” has only an imprecise meaning for a quantum computer. Is it the number of one- and two-qubit gates? What about measurements and preparations? Why are we ignoring qubit count or gate speed or parallelism? How do we quantify it for architectures that use resource states? To define this is to fall down the rabbit hole of benchmarking quantum computers⁴. Benchmarking is great, but it always reminds me of a saying my grandfather used to tell me: “In this traitorous world, nothing is true or false, all is according to the color of the crystal through which you look”. Every benchmark is a myopia, ignoring subtleties in exchange for quantitative precision. And yes, people will fudge any definition of instruction to fit the strengths of their quantum architecture (*ahem* algorithmic qubit *ahem*). But terms like NISQ are meant to label gross eras, and I think it’s okay to have this ambiguity.

One thing I do like about using metric prefixes is that it highlights a particularly pressing problem. While it has been a real challenge to find NISQ algorithms that have “practical” value (whatever that means⁵), an equally pressing problem is what sort of quantum algorithms will be achievable in the MISQ era. The places where we have the most confidence in the algorithmic advantage offered by quantum computers, simulation and experimental math algorithms (like factoring), lie above GISQ and probably in the TISQ region. Roughly, what we need are quantum algorithms that are linear-time algorithms, so that for instance sizes that are becoming non-trivial (say a thousand), their total spacetime volume is this size squared. And while there has been work on algorithms in this regime, I would not say that we confidently have algorithms we know will be practically useful in MISQ. And this MISQ/GISQ gap is extremely scary!
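To make that back-of-the-envelope concrete: a linear-time algorithm touches roughly n qubits for roughly n steps, so its spacetime volume is about n². A sketch of the arithmetic (the linear-scaling assumption and the helper name are mine, following the paragraph above):

```python
def spacetime_volume(n: int) -> int:
    # A linear-time algorithm runs ~n instructions on ~n qubits,
    # so total spacetime volume (qubits x depth) is roughly n^2.
    return n * n

n = 1_000                     # a non-trivial instance size, per the text
volume = spacetime_volume(n)
print(volume)                 # 1000000: squarely in the MISQ (mega) regime
```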

So long live the NISQ era! And onward and up to MISQ and beyond!

  1. “Quantum Computing in the NISQ era and beyond”, Preskill arXiv/1801.00862 ↩︎
  2. “Bye NISQ. Hello LISQ?”, Simone Severini LinkedIn post ↩︎
  3. As an example “Phase transition in Random Circuit Sampling” by the Google group (of which I’m a member) shows a signal for circuits with 32 cycles and 67 qubits. arXiv/2304.11119 ↩︎
  4. A prominent benchmark is Quantum Volume, defined in “Validating quantum computers using randomized model circuits” by Cross, Bishop, Sheldon, Nation, and Gambetta arXiv/1811.12926. This is a fine benchmark modulo that, because executives at BigCos apparently can be fooled by log versus linear scale, they really should have taken the log of the quantity they use to define the Quantum Volume. ↩︎
  5. My own personal opinion is that current claims of “quantum utility” are an oversell, or what we nowadays call quantum hype, but that is a subject for a beer at a quantum beer night. ↩︎

8 Replies to “Acronyms Beyond NISQ”

  1. One thing I liked about NISQ and LISQ was that they draw attention to a sort of divide in what you’re designing your device to do; big NISQ devices aren’t generically good at QEC, but a hypothetical LISQ device has to target some kind of logical qubit and have some QEC properties. I think these instruction-only eras sort of hide away that difference in design. I’m going to boldly predict that you can brute force KISQ and maybe MISQ, but there’s literally no way to do GISQ or TISQ without QEC. The natural progression of KISQ, MISQ, GISQ, TISQ lulls you into a false sense of security lol

  2. I’d say the natural progression up the metric scale is not a lullaby, but an ode to Moore’s law.

    But I don’t disagree that the properties of a LISQ device only partially overlap with those of a NISQ device, though I’d drop the I and the S as I don’t think they add anything (don’t get me started on the word “scale”!)

  3. I like this modest proposal! Two aspects in particular: it’s agnostic to the way to get to a certain number of instructions (i.e. fault-tolerance, vs error mitigation, vs just having really great fidelities – or a combination of all three); and it allows us to think of KISQ as “Keep It Simple, Quantum!”…

    And I think we do actually have some examples now of problems that can be solved with MISQ-era complexities (or lower). Just from projects I’ve worked on, we have materials modelling ( https://arxiv.org/abs/2205.15256 ), VQE/TDS for the Fermi-Hubbard model ( https://arxiv.org/abs/1912.06007 ) and spin models ( e.g. https://arxiv.org/abs/2108.08086 or many more). So I think that MISQ will be an exciting era!

    1. “Keep it Simple, Quantum!” is my new mantra. (I actually have a sign at my desk that says “Simplify”, it’s the only way my poor brain can keep up these days)

    2. Couldn’t agree more with you, Ashley. Hence, Quantum for Bio, as you well know, is focused on developing quantum algorithms in the MISQ regime.
      I use this acronym often, hope it sticks!

  4. Nice post, it is nice to see the Quantum Pontiff on the air again. To the best of my memory, this is the first blog I heard about (from Greg Kuperberg in 2005) and the first whose discussions I participated in, and it had very nice discussions.

    In my 2016 paper “The quantum computer puzzle” I used “small scale” to refer to what Preskill calls “intermediate-scale”. (But NSSQ is less pronounceable than NISQ.) In any case, I gladly moved to the NISQ acronym later (and I like the KISQ, MISQ refinements).

    I thought that understanding whether quantum computers are feasible requires a multi-scale analysis. (My view is that the goals set for the small scale will already fail at the small scale 🙂 )

    Of course, with different terminology, things go back at least to the ’97 pro/con paper of Preskill, where he writes: “the performance of devices with just a few tens of qubits cannot be easily simulated or predicted.” In John’s later terminology, this can be written as “NISQ devices (in fact even KISQ devices) can demonstrate quantum supremacy”. (I think that NISQ devices in the range of 100 and more instructions are inherently unpredictable, which is one reason why they cannot lead to supremacy.)

    P.S. Regarding the paper “Phase transition in Random Circuit Sampling”, I’d love to discuss it privately. Were you involved in this particular project, Dave? 
