In a previous post I presented the skeptical views toward quantum computing of Robert Alicki, Michel Dyakonov, Leonid Levin, Oded Goldreich, Liam McGuinness, Moshe Vardi, and a few others. As I mentioned there, I am quite interested in understanding the diverse perspectives of other people on this issue. Let me add that I am also very interested in connections with mathematics. (Majorana zero modes, discussed in my previous post, are an example of an area with rich mathematical connections.)
In this post, I briefly describe and give links to my own skeptical view. I move on to describe the debate and the counterarguments of John Preskill, Aram Harrow, Scott Aaronson, Dave Bacon, Boaz Barak, and others. Finally, I add a few thoughts on the connections between my arguments and the physical skeptical arguments, and include a roadmap to the debate from my perspective. (This roadmap was also presented in a separate post and is slightly expanded here.)
A quick update about Microsoft’s claims of creating topological qubits. Sergey Frolov and Vincent Mourik (who made major contributions to scrutinizing earlier claims in this area) will hold a live videotaped discussion on February 28, 2025 on the topic “How close are we to a topological qubit?” Here is the Zoom registration link.
My own theory that quantum error correction is not possible
From my perspective, the question of whether quantum computation is possible is among the most important open scientific questions of our time. I hold the view that building scalable quantum computers—and even achieving some early milestones on the path to quantum computing—is impossible. Given the optimistic consensus among many scientists, I believe the burden of proof lies with skeptics like myself. This necessitates continuous examination and refinement of our arguments and the exploration of new ways to articulate and develop them.
My argument against quantum computing is:
The empirical probability distributions produced by intermediate-scale quantum computers are combinations of noise-sensitive distributions, which are useless for computation, and computationally very primitive robust distributions. At larger scales this combination does not support robust quantum information, but it does support robust classical information. It can lead neither to quantum supremacy nor to good-quality quantum error-correcting codes.
Noise sensitivity, which plays a key role in my argument, is a mathematical theory developed by Benjamini, Schramm, and myself in the late 1990s. (It is arguably related to chaotic phenomena in physics.)
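For readers who want the formal notion, here is the basic definition (a minimal formulation; the theory also has an equivalent Fourier-theoretic description). A sequence of Boolean functions $f_n:\{-1,1\}^{m_n}\to\{-1,1\}$ is noise sensitive if for every fixed $\epsilon>0$,

$$\lim_{n\to\infty}\Big(\mathbb{E}\big[f_n(x)\,f_n(x_\epsilon)\big]-\mathbb{E}\big[f_n(x)\big]^2\Big)=0,$$

where $x$ is uniform on $\{-1,1\}^{m_n}$ and $x_\epsilon$ is obtained from $x$ by independently resampling each coordinate with probability $\epsilon$. In words: an arbitrarily small amount of independent noise asymptotically destroys all correlation between the output before and after the noise.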
Here are some links.
(i) G. Kalai, The argument against quantum computers, Quantum, Probability, Logic: Itamar Pitowsky’s Work and Influence (M. Hemmo and O. Shenker, eds.), pp. 399–422, Springer, 2019.
(ii) G. Kalai, The quantum computer puzzle, Notices AMS, May 2016. (Expanded version.)
My papers after 2014 rely strongly on my study of noisy BosonSampling with Guy Kindler.
(iii) G. Kalai and G. Kindler, Gaussian Noise Sensitivity and BosonSampling, 2014.
The general arguments, various connections to mathematics and TOC, and a preliminary study of the 2019 Google experiment can be found in this paper:
(iv) G. Kalai, The argument against quantum computers, the quantum laws of nature, and Google’s supremacy claims, The Intercontinental Academia Laws: Rigidity and Dynamics (M. J. Hannon and E. Z. Rabinovici, eds.), World Scientific, 2024. arXiv:2008.05188, 2020.
In the wider context of “mathematical computer science: theory and practice”:
(v) G. Kalai, Three puzzles on mathematics, computation, and games, Proc. ICM 2018;
(vi) A 2011 paper summarizing my work on correlated errors: How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation.
(vii) My first paper on the topic, from 2005: G. Kalai, Thoughts on Noise and Quantum Computation.
Blog posts: The Argument Against Quantum Computers – A Very Short Introduction (2020); Inaugural address at the Hungarian Academy of Science: The Quantum Computer – A Miracle or Mirage (2022). Movies: Why Quantum Computers Cannot Work: The Movie! (2014). Cartoons: If Quantum Computers are not Possible Why are Classical Computers Possible? (2017). Interview: Quanta Magazine. A recent blog post that led to an interesting discussion: Seven Assertions about Quantum Computing.
Challenges to My Skeptical View
As I mentioned at the beginning of this post, I hold the view that building scalable quantum computers—and even achieving some of the early milestones in quantum computing—is impossible. These early milestones include certain claims of “quantum supremacy” that have already been made but which, in my view, do not hold water. Another early milestone that I expect to be beyond reach is the creation of logical qubits and gates of very good quality.
The roadmap to this debate, outlined below, includes three major aspects of my research (items 4, 5, and 6):
- The claim that “above the fault-tolerance threshold,” systematic relationships will emerge between the desired quantum state and the noise.
- The claim that samples produced by NISQ computers represent a low-level computational complexity class, LDP (as identified by Guy Kindler and me), and that such samples inherently exhibit noise-sensitive behavior.
- In addition, samples from NISQ computers will include a large noise-sensitive component. This means that the empirical distribution is inherently non-stationary and unpredictable.
These directions present intriguing mathematical challenges, and exploring their connections to various issues in quantum physics—including those raised by other quantum computing skeptics—would be valuable. Testing my perspective and specific conjectures against experimental progress remains an ongoing challenge. Some of the specific predictions I have made can already be tested with current experiments.
Furthermore, critically examining current experimental work is itself a worthwhile endeavor. This scrutiny touches on significant technical and conceptual statistical problems and intersects with the broader and crucial issue of the “replication crisis” in various scientific fields.
Two frequently asked questions
Q1: What is the computational complexity class LDP?
A1: LDP is the class of probability distributions P on 0-1 vectors of length n that can be approximated by polynomials of bounded degree D. This class is a subclass of the probability distributions that can be approximated by bounded-depth circuits. It is a very tiny subclass of the class BPP of probability distributions that can be approximated by efficient classical computation.
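Here is a toy illustration in Python of what “approximated by polynomials of bounded degree D” means. This is a minimal brute-force sketch for illustration only: the enumeration is exponential in n, the two demo distributions are made up, and the low-degree truncation need not itself be a genuine probability distribution (it can take small negative values).

```python
import itertools
from math import prod

def fourier_coefficients(p, n):
    """Fourier-Walsh coefficients of p : {0,1}^n -> R, where
    hat_p(S) = 2^{-n} * sum_x p(x) * chi_S(x) and chi_S(x) = (-1)^{|S and x|}."""
    points = list(itertools.product([0, 1], repeat=n))
    return {S: sum(p[x] * (-1) ** sum(s * xi for s, xi in zip(S, x))
                   for x in points) / 2 ** n
            for S in points}  # subsets S encoded as 0-1 indicator vectors

def low_degree_truncation(p, n, D):
    """Keep only the Fourier-Walsh terms of degree <= D.  Informally, p is
    'low degree' if, for some bounded D, this truncation already
    approximates p well.  (The truncation can go slightly negative.)"""
    coefs = fourier_coefficients(p, n)
    points = list(itertools.product([0, 1], repeat=n))
    return {x: sum(c * (-1) ** sum(s * xi for s, xi in zip(S, x))
                   for S, c in coefs.items() if sum(S) <= D)
            for x in points}

# Tiny demo: a mildly biased product distribution (essentially low degree)
# versus a parity distribution (degree n, not approximable by low degree).
n = 3
points = list(itertools.product([0, 1], repeat=n))
product_dist = {x: prod(0.6 if b == 0 else 0.4 for b in x) for x in points}
parity_dist = {x: 2.0 / 2 ** n if sum(x) % 2 == 0 else 0.0 for x in points}
for name, p in [("product", product_dist), ("parity", parity_dist)]:
    q = low_degree_truncation(p, n, D=1)
    print(name, "L1 error of degree-1 truncation:",
          round(sum(abs(p[x] - q[x]) for x in points), 3))
```

The product distribution incurs only a small truncation error, while the parity distribution loses everything except its uniform part, illustrating the gap between low-degree and genuinely high-degree distributions.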
Q2: If indeed the empirical distribution is inherently non-stationary and unpredictable, couldn’t it still represent quantum supremacy? It looks as if noise makes simulation on a classical computer even harder!
A2: Consider, for concreteness, samples of bitstrings, 0-1 vectors of length n. Non-stationary and unpredictable behavior of the samples means that the individual bitstrings in a sample are functions of many unknown parameters of the noise. If every bitstring depends, among other things, on the temporal parameters of the noise, then the complexity of the sample—measured in terms of input size—fails to demonstrate any computational advantage.
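As a toy illustration of the non-stationarity claim (a minimal sketch; the drift model below is entirely made up and merely stands in for unknown temporal noise parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_batch(t, size=1000):
    """Toy model: every batch of bits depends on an unobserved noise
    parameter theta(t) that drifts with time t (a stand-in for the
    many temporal parameters of real device noise)."""
    theta = 0.5 + 0.1 * np.sin(t / 3.0) + 0.05 * rng.standard_normal()
    return (rng.random(size) < theta).astype(int)

freqs = [sample_batch(t).mean() for t in range(12)]
print(np.round(freqs, 3))
# A stationary source would keep batch frequencies within the binomial
# fluctuation ~0.016 of a fixed value; here they wander far beyond it,
# so no single distribution describes (or predicts) the samples.
```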
This commonly asked question illustrates how, once someone gets into the mindset of believing in miraculous, naturally occurring quantum computational supremacy that simply needs to be “harnessed,” it becomes difficult to abandon this perspective.
_______________
I move now to present arguments by proponents of quantum computers about the debate. John Preskill is a quantum computing proponent who has offered thoughtful and balanced descriptions of the “pros and cons” in some of his works.
Papers by John Preskill
Following are links to some very useful papers by John Preskill. Preskill’s homepage contains many interesting lectures, links, and other material, as does the group blog “Quantum Frontiers”.
p.1 Quantum computers pro and con (1998). This paper passes the “test of time” with flying colors and includes many interesting issues and ideas. The paper was, in part, a response to the 1996 paper of S. Haroche and J. M. Raimond, Quantum computing: dream or nightmare? (It appears on pages 341–344 of the full proceedings of an interesting 1996 conference.) One theme of the paper is the question regarding the abilities of quantum computers with “a few tens of qubits” (NISQ in modern language).
Among a lot of other interesting assertions, Preskill proposed the following statement: “The performance of devices with just a few tens of qubits cannot be easily simulated or predicted.”
One way to interpret this statement (in modern terms later coined by Preskill himself) is that “NISQ computers can lead to quantum computational supremacy”—an idea that has been widely promoted and studied since around 2010. (This is likely what John had in mind.) Another interpretation of the very same statement is that NISQ computers produce inherently unpredictable empirical distributions, which is a key ingredient of my theory.
p.2 Quantum computing and the entanglement frontier (2012). This is a very interesting paper where (among other things) the term “quantum supremacy” was introduced.
p.3 Sufficient condition on noise correlations for scalable quantum computing (2013). This paper responds to concerns raised by Robert Alicki and myself regarding inherently correlated errors.
p.4 Quantum computing in the NISQ era and beyond (2018). In this hugely cited paper the term “NISQ” (for “noisy intermediate-scale quantum”) was coined. Let me mention that a two-scale analysis of the computational power of quantum circuits is also a crucial ingredient in my 2016 paper on quantum computation (and subsequent papers).
Seeing John Preskill in the Google “ten septillions over classical” announcement video reminded me of the words of the Hebrew song “לאן התגלגלת יא דודו” (“Where did you get yourself into, ya Dudu”) 🙂 .
Next we give some details and links on my debate with Aram Harrow, on Dave Bacon’s early writings, on Scott Aaronson’s views, and on a 2017 blogpost by Boaz Barak.
The 2012 Kalai-Harrow debate and other Internet Discussions
My 2012 debate with Aram Harrow was kindly hosted by Ken Regan and Dick Lipton on GLL (Gödel’s Lost Letter and P=NP).
Here are the links to the eight posts. My initial post, “Perpetual Motion of The 21st Century?”, was followed by three posts by Aram: the first, “Flying Machines of the 21st Century?”, the second, “Nature does not conspire”, and the third, “The Quantum super-PAC”. We then had two additional posts, “Quantum refutations and reproofs” and “Can you hear the shape of a quantum computer?”. Finally, we had two concluding posts: “Quantum repetition” and “Quantum supremacy or classical control?” Some general issues raised in the discussion are listed in this comment and these follow-ups (1,2,3,4,5,6).
Cris Moore wrote in the very first comment of the debate:
It’s reasonable to take as a physical postulate (…) that NP-complete problems can’t be solved by a physical device in polynomial time (…). But I don’t see why the factoring problem deserves this status.
Sampling according to the values of permanents, a task which is outside the polynomial hierarchy, should raise eyebrows even among those who are not impressed by factoring! (And going outside PH can already be achieved by bounded-depth quantum circuits.)
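For orientation, here is the standard formula behind these statements (in the Aaronson–Arkhipov BosonSampling model; given here only as background). The permanent of an $n\times n$ matrix $A=(a_{ij})$ is

$$\mathrm{Per}(A)=\sum_{\sigma\in S_n}\prod_{i=1}^{n}a_{i,\sigma(i)},$$

and the probability of observing the photon-occupation outcome $S=(s_1,\dots,s_m)$ is $\Pr[S]=|\mathrm{Per}(A_S)|^2/(s_1!\cdots s_m!)$, where $A_S$ is built from the relevant $n$ columns of the unitary describing the device by repeating row $i$ exactly $s_i$ times. Computing the permanent exactly is #P-hard (Valiant), which is the engine behind the “outside the polynomial hierarchy” claims for exact sampling.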
There were additional discussions on GLL, SO, my blog, and other blogs. Here are a few selected posts with interesting discussions: A Few Slides and a Few Comments From My MIT Lecture on Quantum Computers (2013); Why Quantum Computers Cannot Work: The Movie! (2014); Quantum Supremacy and Complexity (2016). I also wrote several posts summarizing my debate with Aram, where I tried to present the different participants, different points of view, and amusing moments: I, II, III.
Scott Aaronson’s Writings and Views
In 2004 Scott wrote an interesting paper, Multilinear formulas and skepticism of quantum computing. The paper calls on skeptics to offer “Sure/Shor” separators, namely quantum states that are possible with quantum computers but beyond reach according to skeptical views.
Since the end of 2005, Scott’s blog “Shtetl Optimized” (SO) has included many interesting discussions on matters related to quantum computation. Scott’s lecture about skepticism of quantum computing (which later turned into a chapter of his book Quantum Computing Since Democritus) is discussed in this 2008 SO post. Another interesting post and discussion thread from 2006 is Reasons to believe II: quantum edition. There, Scott started out by repeating (“for the ten billionth time”) the Official Scott Aaronson Quantum Computing Position Statement:
- It’s entirely conceivable that quantum computing will turn out to be impossible for a fundamental reason.
- This would be much more interesting than if it’s possible, since it would overturn our most basic ideas about the physical world.
- The only real way to find out is to try to build a quantum computer.
- Such an effort seems to me at least as scientifically important as (say) the search for supersymmetry or the Higgs boson.
- I have no idea — none — how far it will get in my lifetime.
Since around 2010, Scott has raised and strongly promoted the idea that NISQ computers (specifically boson-samplers) can demonstrate quantum computational supremacy. The argument for why this is impossible has been central to my own research since 2013. For people who can digest the idea of efficient quantum factoring, BosonSampling provides something even harder to digest: precise sampling according to values of permanents—a task outside the polynomial hierarchy! (This is item 9 in our roadmap.)
In recent years, Scott has expressed more and more confidence that the skeptical view has been refuted. See, e.g., his post: My “Quantum Supremacy: Skeptics Were Wrong” 2020 World Speaking Tour. Scott also argues that “20 years of covering hyped, dishonestly-presented non-milestones in quantum computing” gives more credibility to his support of what he regards as “real milestones.”
Scott sometimes expresses his opinions in an emotional or unrestrained manner. His writings and lectures sometimes include exaggerations or inaccuracies, often to emphasize his perspective or for comedic effect. While these features attract interest and attention, they can pose difficulties in technical discussions.
Boaz Barak’s 2017 Post
Boaz Barak wrote in 2017 an interesting post, The different forms of quantum computing skepticism. Boaz’s post is centered around a description of four scenarios (or possible “worlds,” in the style of Russell Impagliazzo). Boaz’s worlds are called Superiorita (quantum computers are realistic and computationally superior), Popscitopia (quantum computers are realistic and can solve NP-complete problems), Skepticland (quantum computers are not realistic), and Classicatopia (quantum algorithms can be efficiently simulated on classical computers). The comment thread also contains a distinction between quantum computer believers who regard quantum fault-tolerance as the basic reason for the ability to overcome noise, and those believers (“fundamentalists”) who regard quantum fault-tolerance as a “red herring” and think that, in principle, the noise could be reduced to the extent that we could run Shor’s algorithm on quantum computers with no need for quantum fault-tolerance. Boaz concluded his post by stating where he stands, noting that a decade earlier he had held the opposite stance. In his 2017 post Boaz regarded quantum computing as the beautiful choice. In addition, he believed that noise reflects only engineering difficulties that are not fundamental barriers and that “with sufficient hard work and resources the noise can be driven down to as close to zero as needed.” I reblogged Boaz’s post and added some commentary here.
Let me also mention that Or Sattath and Uriel Shinar wrote a paper relating QKD (quantum key distribution schemes) and quantum computing skepticism. They argued that the inability to maintain quantum memory (referred to by Or and Uriel as “Quamnesia”) could be a blessing for certain cryptographic tasks!
Boaz himself wrote and hosted several interesting posts about computation and physics; let me mention especially his enlightening post on black holes.
Dave Bacon’s old posts over at The Quantum Pontiff
Early on, there were interesting discussions on skepticism of quantum computers on Dave Bacon’s blog “The Quantum Pontiff”. I will only mention a few posts.
b.1 Here is a 2006 post titled What Are the Assumptions of Classical Fault-Tolerance?
Dave concluded his post with the following suggestion:
I just want to suggest that the more we learn about the theory of fault-tolerant quantum computation, the more we recognize its similarity to probablistic classical computation. I call this general view of the world the “principal of the ubiquitous factor of two in quantum computing.” The idea being mostly that quantum theory differs from probablistic theory only in the necessity to deal with phase as well as amplitude errors, or more generally, to deal with going from probabilities to amplitudes.
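In standard terms, Dave’s “factor of two” can be read as the two basic Pauli error types: a classical code only needs to handle bit flips, while a quantum code must handle bit flips and phase flips (and, by linearity, their combinations):

$$X|0\rangle=|1\rangle,\quad X|1\rangle=|0\rangle; \qquad Z|0\rangle=|0\rangle,\quad Z|1\rangle=-|1\rangle,$$

and an arbitrary single-qubit error can be expanded in terms of $I$, $X$, $Z$, and $XZ$.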
b.2 My participation
I believe this was the first time I participated in a blog discussion, and I presented a couple of early conjectures and intuitions in the comment section of Dave’s blog. Much of the discussion was about Dave’s question of why classical computation is possible. In one comment I raised the intuition that the noise for complex states inherently carries similar complexity, and in a following comment I offered a mathematical formulation. (I tried to put this intuition on rigorous mathematical grounds in the following years.)
b.3 The paper of Alicki, Lidar, and Zanardi.
The main argument put forth in this paper is that a quantum computer with Markovian noise, as assumed in many (but not all) threshold calculations, leads to the following contradiction. If the system has fast gates, as required by the threshold theorem, then the bath must be hot, and this contradicts the requirement of cold ancillas. If, on the other hand, the bath is cold, then the gates can be shown to be necessarily slow for the Markovian approximation to be valid.
On the Scope of Certain Physical Laws in a World Without Quantum Fault Tolerance
Here are a few thoughts on how my arguments, introduced at the beginning of this post, connect with those of Robert Alicki and other skeptical physical arguments. As I said before, I do not expect a mathematical theorem proving the impossibility of quantum computers, nor do I foresee a derivation of their impossibility from an agreed-upon physical principle. However, in a world devoid of quantum fault tolerance—or even in restricted cases where quantum fault tolerance does not occur—the scope of certain physical laws may extend beyond what is strictly dictated by their mathematical derivations.
Robert Alicki, in his first critical paper on quantum computers, relied on the time-energy uncertainty principle (TEUP). This principle was rigorously established by (Yakir) Aharonov, Massar, and Popescu for completely unknown Hamiltonians. However, (Dorit) Aharonov and Atia regarded TEUP as a misconception, since strong violations follow from Shor’s algorithm—though these violations depend on the existence of quantum fault tolerance. In a world without quantum fault tolerance, the TEUP may hold in greater generality.
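For orientation, one standard quantitative form of TEUP is the Mandelstam–Tamm bound (the formulations at stake in the papers mentioned above are more refined, so this is only a first approximation): the time $\tau$ needed for a quantum state to evolve into an orthogonal state satisfies

$$\tau \;\ge\; \frac{\pi\hbar}{2\,\Delta E},$$

where $\Delta E$ is the energy uncertainty of the state.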
Another tension arises from the reversibility of noiseless quantum evolution, which allows complex quantum processes to be run both forward and backward in time. This seems, in itself, to conflict with thermodynamics. Alicki, along with several coauthors, has provided detailed thermodynamic arguments suggesting inherent limitations on quantum computation—arguments that effectively “kill” quantum information while preserving only robust classical information. These arguments, however, did not convince the broader physics community, as there were notable counterarguments, which Alicki himself addressed in his papers (and Dave Bacon in his blog). If an argument emerges demonstrating that quantum fault tolerance is fundamentally impossible—one that does not rely on thermodynamics—it could lend further justification to Alicki’s thermodynamic claims.
A third example of a similar nature is the no-cloning theorem. Mathematically, it is well established (and not difficult) that one cannot clone a completely unknown quantum state. However, for a known quantum state described by a quantum circuit, it seems possible to run the circuit multiple times to generate as many copies as needed—or is it? For complex quantum states, achieving this requires quantum fault tolerance. In a world without quantum fault tolerance, a stronger version of the no-cloning theorem may hold. (See the fourth prediction in this post.)
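The standard one-line argument for the basic no-cloning theorem: if a unitary $U$ satisfied $U(|\psi\rangle\otimes|0\rangle)=|\psi\rangle\otimes|\psi\rangle$ for all states $|\psi\rangle$, then applying it to $|\psi\rangle=(|0\rangle+|1\rangle)/\sqrt{2}$ would give, by linearity, $(|0\rangle|0\rangle+|1\rangle|1\rangle)/\sqrt{2}$, which differs from the required $(|0\rangle+|1\rangle)\otimes(|0\rangle+|1\rangle)/2$. Note that this argument says nothing about cloning a known state prepared by a known circuit, which is exactly the gap discussed above.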
Finally, one might ask whether a world devoid of quantum fault tolerance also precludes the existence of stable non-Abelian anyons. While such a world would certainly rule out very stable topological qubits—and, in fact, any kind of highly stable qubit—the implications for non-Abelian anyons remain an open question. I raised this as a question in an earlier post, along with an idea for how to approach it.
Roadmap for the Debate on Quantum Computing (slightly expanded)
Here is a roadmap to the debate about quantum computers from my perspective. Skeptical claims are in blue, and responses to these claims are in red. Points 1–10 represent skeptical claims made over the years, while points a)–s) are replies to these skeptical claims. Points (i)–(xi) are replies to some of these replies. Underlined claims were made by me. Green boldface indicates points of agreement. (A somewhat shorter version was also published as a separate post.)
1. The understanding of noise is a crucial ingredient of any model of computation. Noise may cause quantum computing to fail.

a) Noise is not inherent. There is no inherent “noise” in quantum mechanics.

(i) A realistic computational model must also include the modeling of errors.

b) Quantum error correction and quantum fault tolerance have the potential to deal successfully with noise and various forms of inaccuracies.

c) Topological quantum computing gives an alternative avenue for quantum computing that does not depend on the circuit model.

2. General forms of quantum noise (in particular, correlated errors) will fail quantum error correction.

d) These types of general noise are not physical.

e) These types of general noise would cause classical computation to fail as well.

f) For arbitrary forms of noise, if the error rate is sufficiently small, log-depth quantum computing survives.

3. Specific physical noise models differ from the ideal model.

g) We will be able to extend the threshold theorem to these models as well.

(ii) All those extensions still assume common properties that might be unrealistic.

4. There will be a systematic relationship between the “signal” and the noise. For example, the noise for a pair of entangled qubits will be highly correlated. (This argument extends to attempts to build quantum gadgets needed for topological quantum computing.)

h) This cannot be: how would nature know what the intended state is? It is like saying, “wet roads cause rain to pour.”

(iii) This systematic relation can be a general rule if you consider special-purpose quantum computational devices.

(iv) This systematic relation holds for NISQ (noisy intermediate-scale quantum) systems and more generally for systems that do not apply quantum fault tolerance. This systematic relation describes quantum physics “above the quantum fault tolerance threshold.”

i) Such a principle would harm classical computation as well, unless you believe that by a miracle the threshold for fault tolerance lies between what is required for classical and quantum computation.

(v) The argument about the miraculous location of the threshold is incorrect because classical computers are not universal quantum computers.

5. NISQ computers represent a primitive classical computational power (LDP), which implies an inherent reason they cannot achieve quantum computational supremacy. This suggests there is an absolute lower bound on noise rates that prevents effective quantum error correction.

j) It is incorrect to apply asymptotic insights (on the number of qubits) to systems with only tens or a few hundred qubits.

(vi) We commonly apply asymptotic insights to small systems.

k) But why are classical information and computation possible?

(vii) The computational class LDP still supports classical information.

l) The Google 2019 experiment refutes this argument.

(Agreed upon) Google’s 2019 supremacy claims appear to be incorrect.

m) Some aspects of the Google 2019 experiment and its replications are promising and support basic assumptions about quantum noise.

(viii) These aspects of Google 2019 appear to be incorrect as well.

n) (New) The 2019 Google experiment was long ago superseded. Several recent experiments, including Google’s “septillion years advantage over classical” and other results from IBM, Quantinuum, QuEra, and USTC, are more robust.

(ix) We need to scrutinize these assertions as well. Gradual experimental progress is expected to hit a barrier, and fantastical experimental claims are expected to be falsified.

6. Samples from NISQ computers will include a large noise-sensitive component. This means that the empirical distribution is inherently non-stationary and unpredictable.

o) The samples may still express quantum computational supremacy. It is just that we cannot harness or use this supremacy.

(x) Each bitstring in the sample (and even each bit) is a function of many temporal noise parameters; there is no reason to think that this function represents unusual computational power, and there is no computational supremacy hiding in this picture.

7. Building quantum computers is a ridiculously hard engineering task.

p) It is very hard but not ridiculously hard; progress is being made, and a long-term approach is necessary.

(xi) Even achieving near-term tasks is inherently impossible.

(Agreed upon) We may have an opportunity to test this matter in the next few years.

8. (Simplified) Quantum computations conflict with important principles and laws from physics and computing theory:

8.1 The time-energy uncertainty principle.

8.2 The effective Church–Turing thesis.

8.3 Several principles of thermodynamics.

8.4 Bohr’s correspondence principle.

q) (Simplified) These physical and computational rules are not universally correct, which is part of what makes quantum computation so remarkable, as it would enable novel physical and computational behaviors.

9. Quantum computers will allow sampling according to the exact values of permanents. Achieving this is implausible given that computing permanents is #P-complete.

r) Yes!! This is the computational trademark twist of Nature.

10. Quantum computers will (likely) enable computing efficiently the nth digit of the fine-structure constant for large values of n where it has no physical meaning, as easily as we compute the digits of e and π with today’s digital computers.

s) Yeah, why not?
Summary
Proponents of Quantum Computers: BQP is the computational complexity class representing our quantum physical world. BQP characterizes the complexity of quantum processes on a microscopic scale, as evidenced by the challenges of simulating quantum physics on classical computers. Building quantum computers is primarily an engineering task, and pursuing this endeavor will drive exciting scientific progress across various fields. Many promising avenues exist, ranging from different implementations of quantum circuits to non-Abelian anyons and topological quantum computing. Numerous impressive experimental results have already been achieved.
Skeptics of Quantum Computers: BPP is the computational class representing computation in the physical world. Classical information and computation are sustained in our noisy quantum world through the “averaging” of noisy quantum information. Quantum computational supremacy and high-quality quantum error correction are inherently impossible, and there is no compelling evidence for them in natural quantum processes. Advances in experimental quantum computing should be scrutinized carefully. Gradual experimental progress is expected to hit a barrier, and fantastical experimental claims are unlikely to withstand rigorous examination.
A “meta claim” of proponents is that there is no symmetry between the two sides of the debate: the prevalent view shared by many experts is that quantum computation is possible.