Google's Willow Quantum Chip
Quantum Computing
October 26, 2025
Google’s Willow chip signals the start of practical quantum computing—driven not by headline speedups, but by scalable error correction.


Introduction: The Septillion-Year Shortcut That Isn't the Real Story

Recent headlines declared that Google's new "Willow" quantum chip performed a complex calculation in under five minutes that would take today's fastest supercomputer an estimated 10 septillion years to complete. The number is staggering, a mind-bending shortcut that sounds like a leap into science fiction. But while these benchmarks grab attention, they obscure a far more important and revolutionary story unfolding in quantum computing. The truth is more nuanced, surprising, and ultimately more critical for the future of science and technology. Behind the hype are fundamental shifts in both hardware and software that are setting the stage for a genuine transformation. Here are the five most impactful, and often counter-intuitive, truths hidden behind the headlines.

--------------------------------------------------------------------------------

1. The Most Famous Quantum Benchmark is "Totally Useless"

The task used to generate the "10 septillion year" headline is a benchmark called Random Circuit Sampling (RCS). In simple terms, RCS is a highly specific, abstract problem designed to be exceptionally difficult for classical computers while remaining relatively straightforward for a quantum computer. It is a tailor-made test to showcase a quantum processor's unique strengths.
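For readers who want a concrete feel for what RCS involves, the toy sketch below builds a tiny random circuit, samples output bitstrings, and scores them with the linear cross-entropy benchmarking (XEB) fidelity used to grade such experiments. It is an illustration only: the qubit count, depth, and gate choices are assumptions picked for readability, and the real Willow experiment runs on roughly a hundred superconducting qubits at depths no exact classical simulation can reach.

    # Toy sketch of Random Circuit Sampling (RCS): build a small random circuit,
    # sample output bitstrings, and score them with the linear cross-entropy
    # benchmarking (XEB) fidelity. All sizes and gate choices here are
    # illustrative assumptions, far below Willow's actual scale.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    N_QUBITS = 5                       # toy size; Willow uses ~100 qubits
    DIM = 2 ** N_QUBITS

    def random_su2():
        # Haar-random 2x2 unitary via QR decomposition with a phase fix.
        z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))

    def apply_gate(state, gate, targets):
        # Apply a k-qubit gate (2^k x 2^k matrix) to the listed qubits.
        k = len(targets)
        psi = state.reshape([2] * N_QUBITS)
        psi = np.moveaxis(psi, targets, list(range(k)))   # targets to the front
        psi = gate @ psi.reshape(2 ** k, -1)              # act on those axes
        psi = np.moveaxis(psi.reshape([2] * N_QUBITS), list(range(k)), targets)
        return psi.reshape(DIM)

    CZ = np.diag([1, 1, 1, -1]).astype(complex)           # two-qubit entangler

    # Run a shallow random circuit: single-qubit layers plus CZ "brickwork".
    state = np.zeros(DIM, dtype=complex)
    state[0] = 1.0                                        # start in |00000>
    for layer in range(8):
        for q in range(N_QUBITS):
            state = apply_gate(state, random_su2(), [q])
        for q in range(layer % 2, N_QUBITS - 1, 2):
            state = apply_gate(state, CZ, [q, q + 1])

    # Sample bitstrings from the ideal output distribution.
    probs = np.abs(state) ** 2
    probs /= probs.sum()
    samples = rng.choice(DIM, size=2000, p=probs)

    # Linear XEB fidelity: F = 2^n * (mean ideal probability of the samples) - 1.
    # An ideal sampler scores near 1; uniform random guessing scores near 0.
    f_xeb = DIM * probs[samples].mean() - 1.0
    print(f"linear XEB fidelity estimate: {f_xeb:.3f}")

The scoring formula is the key idea: an ideal sampler pushes the XEB fidelity toward 1, uniform random guessing scores near 0, and real hardware lands somewhere in between. That gap, not any useful answer, is what the benchmark measures.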

However, beyond serving as a performance metric, this benchmark has no practical application. So while the media focuses on a septillion-year speedup, the surprising truth is that it's a speedup on a problem with no intrinsic value. It doesn't solve a real-world problem or provide insight into a scientific mystery. Experts in the field are direct about its limitations:

"It is a problem that is cherry-picked for a quantum computer to shine, a way to quantify performance without offering a practical solution to anything anyone really cares about, except random number generation, which has certain niche applications."

— Daniel Lidar, University of Southern California (as quoted in Engineering)

"It is totally useless. The only point is to prove, very indirectly, that your quantum computer is plugged in."

— Chris Monroe, Duke University (as quoted in Engineering)

These benchmarks are also a moving target. The previous "quantum supremacy" milestone from 2019, which Google initially claimed would take a supercomputer 10,000 years to simulate, can now be replicated classically in just 6.18 seconds on a modern supercomputer, thanks to more sophisticated classical simulation algorithms. This illustrates that while these tests are valuable for measuring progress, they are not the true story of the quantum revolution. If these flashy benchmarks aren't the real story, then what is? The true breakthrough in the Willow experiment was far less publicized, but infinitely more important.

--------------------------------------------------------------------------------

2. The Real Breakthrough is "Boring" (But Changes Everything)

The most significant achievement of the Willow chip was not the RCS speedup, but a far less flashy milestone: the demonstration of "below-threshold" quantum error correction (QEC) at scale. While the headlines focused on a septillion-year shortcut on a useless problem, this quiet achievement in error correction is what actually makes a useful quantum computer plausible.

Traditionally, the fragile nature of qubits meant that adding more of them to a system would inevitably increase the overall error rate. "Below-threshold" QEC flips this dynamic on its head. For the first time at this scale, adding more physical qubits to protect a single "logical qubit" actually decreased the error rate. In the experiment, the error rate dropped by a factor of about 2.14 each time the lattice of physical qubits was scaled up, from a 3x3 grid to a 5x5 grid and then to a 7x7 grid.
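To see why that factor of about 2.14 matters, the rough sketch below extrapolates the reported suppression factor across increasing code distances (the 3x3, 5x5, and 7x7 grids correspond to surface-code distances 3, 5, and 7). The starting error rate is a made-up placeholder rather than a measured Willow figure, and the qubit counts use the standard 2d^2 - 1 formula for a rotated surface code of distance d.

    # Back-of-the-envelope look at "below-threshold" scaling: each step up in
    # surface-code distance (3 -> 5 -> 7 -> ...) divides the logical error rate
    # by the suppression factor Lambda (~2.14 reported for Willow). The starting
    # error rate is a hypothetical placeholder, not a measured value.
    LAMBDA = 2.14            # error suppression per distance step (d -> d + 2)
    BASE_ERROR = 1e-2        # hypothetical logical error rate at distance 3

    for d in [3, 5, 7, 9, 11, 13]:
        steps = (d - 3) // 2                  # number of d -> d + 2 increases
        logical_error = BASE_ERROR / LAMBDA ** steps
        physical_qubits = 2 * d * d - 1       # rotated surface code: data + measure qubits
        print(f"distance {d:2d}: ~{physical_qubits:3d} physical qubits, "
              f"logical error ~ {logical_error:.2e} per cycle")

The pattern is the point: below threshold, each step up in code distance buys a multiplicative reduction in logical error, so reliability compounds exponentially as the code grows.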

The importance of this cannot be overstated. It is the first concrete experimental evidence of a scalable path toward building fault-tolerant quantum computers—machines that become more reliable as they grow larger. This achievement, not the abstract benchmark, is the true milestone on the road to practical, world-changing quantum computing. But building fault-tolerant hardware is only half the battle. Just as critically, the software that will run on these machines is advancing at a breathtaking pace of its own.

--------------------------------------------------------------------------------

3. Software is Eating the Quantum World, Too

While hardware progress like the Willow chip captures the public imagination, the less-visible but equally explosive progress in quantum algorithms reveals another surprising truth. A recent analysis from the National Energy Research Scientific Computing Center (NERSC) highlights just how dramatic these software-driven gains are.

The report presents a case study on calculating the ground state energy of the FeMoco molecule, a key problem in quantum chemistry that is notoriously difficult for classical computers. Over the past five years alone, constant-factor improvements in quantum algorithms—not hardware—have reduced the resources needed to solve this problem by "orders of magnitude." Specifically, this has led to a roughly 1000-fold reduction in the number of quantum gates required.
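A back-of-the-envelope calculation shows why a 1000-fold gate reduction is so consequential. The gate counts and the time per logical operation below are placeholder assumptions chosen for illustration, not figures from the NERSC report; the point is simply that at a fixed logical gate speed, runtime shrinks by the same factor as the gate count.

    # Illustrative arithmetic only: at a fixed logical gate speed, a 1000x
    # reduction in gate count is a 1000x reduction in runtime. The gate counts
    # and gate time are placeholder assumptions, not figures from the report.
    LOGICAL_GATE_TIME_S = 1e-6          # assumed time per logical operation
    GATES_BEFORE = 1e15                 # hypothetical pre-improvement gate count
    GATES_AFTER = GATES_BEFORE / 1000   # after a ~1000x algorithmic reduction

    for label, gates in [("older algorithm", GATES_BEFORE),
                         ("improved algorithm", GATES_AFTER)]:
        runtime_days = gates * LOGICAL_GATE_TIME_S / 86400
        print(f"{label}: {gates:.1e} logical gates -> ~{runtime_days:,.1f} days")

On these assumed numbers, the same calculation drops from decades to days, which is the difference between "impossible in practice" and "a long-running job."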

This 1000x algorithmic improvement means that quantum advantage for a problem like FeMoco could arrive years earlier and on far less advanced hardware than the roadmaps alone might suggest. This is profoundly impactful because the journey to quantum advantage is a race on two fronts. Smarter, more efficient software can deliver massive performance gains independently of hardware improvements, making challenging problems feasible on less powerful machines than previously believed. This parallel race between hardware and software highlights a final, crucial dependency: the massive amount of classical computing power required to make it all work.

--------------------------------------------------------------------------------

4. Your Quantum Computer Needs a Classical Supercomputer on the Side

The popular image of a quantum computer is that it will replace supercomputers. The surprising reality is that it will be completely dependent on them, requiring a high-performance classical computer just to keep its own errors in check.

The Google Willow experiment provides a stark example. To decode the error information produced by the quantum error correction running on the chip, the researchers used a classical machine built around a 64-core AMD Ryzen Threadripper PRO 5995WX processor. That classical horsepower wasn't spent running the quantum algorithm itself, but on the "boring" yet essential task of error decoding. Even so, a surprising limitation emerged: the decoding for the most complex part of the experiment (the distance-7 surface code) could not be performed in real time.
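The real-time constraint is easy to state as arithmetic: if the surface code completes an error-correction cycle roughly every microsecond, the classical decoder must finish each round of syndrome data in about that time on average, or a backlog of undecoded measurements grows without bound. The cycle time and decode latencies in the sketch below are illustrative placeholders, not Willow's measured values.

    # The real-time decoding constraint, with assumed numbers: if a QEC cycle
    # takes ~1 microsecond, the classical decoder must digest each round of
    # syndrome data in about that long on average, or a backlog builds up.
    # The cycle time and decode latencies are illustrative placeholders.
    CYCLE_TIME_US = 1.0                 # assumed surface-code cycle time
    decode_latency_us = {               # hypothetical average decode time per round
        "distance-3": 0.3,
        "distance-5": 0.8,
        "distance-7": 2.5,
    }

    for code, latency in decode_latency_us.items():
        deficit = latency - CYCLE_TIME_US
        if deficit <= 0:
            status = "keeps up (real-time decoding)"
        else:
            status = f"falls behind by {deficit:.1f} us every cycle"
        print(f"{code}: {latency:.1f} us decode per {CYCLE_TIME_US:.1f} us cycle -> {status}")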

This underscores a fundamental truth about the future of high-performance computing. It will not be a choice between classical and quantum. Instead, it will be a deeply integrated, hybrid model where classical supercomputers and quantum processors work in tandem, each handling the tasks for which they are best suited.

--------------------------------------------------------------------------------

5. Science's Biggest Problems Are on a Collision Course with Quantum's Rise

The demand for quantum-scale computation already exists, and it is massive. According to the NERSC report, "At least 50% of NERSC compute resources are spent on solving quantum mechanical problems relevant to materials science, quantum chemistry, and nuclear and high energy physics." This isn't a story about searching for a problem to solve; it's about an existing, massive scientific workload on a collision course with a new form of computing.

Juxtapose this immense need with the projections from quantum hardware vendors. A consolidated analysis of their public roadmaps predicts an "exponential increase in quantum computer performance over the next decade by up to nine orders of magnitude." When these two trends—the needs of science and the capabilities of hardware—are plotted, a clear picture emerges. The NERSC report concludes that a "significant overlap" between the hardware capabilities promised by vendors and the resources required by scientists is expected to emerge within the next five to ten years.
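As a quick sanity check on that roadmap claim, nine orders of magnitude over ten years works out to a compound improvement of roughly 8x per year. The short calculation below is simple arithmetic on the headline number, not data from any specific vendor.

    # Simple arithmetic on the roadmap claim, not data from any vendor: nine
    # orders of magnitude of performance growth over ten years implies roughly
    # an 8x improvement per year, compounded.
    import math

    TOTAL_GROWTH = 1e9     # up to nine orders of magnitude
    YEARS = 10
    annual_factor = TOTAL_GROWTH ** (1 / YEARS)
    print(f"implied compound growth: ~{annual_factor:.1f}x per year "
          f"(~{math.log10(TOTAL_GROWTH) / YEARS:.1f} orders of magnitude per year)")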

This signals a major inflection point. The era of quantum computing as a tool for purely theoretical exploration is drawing to a close. We are entering a decade where these machines are expected to begin tackling a huge fraction of today's most challenging and important scientific workloads.

--------------------------------------------------------------------------------

Conclusion: Beyond If, to When and How

The true story of quantum progress is more complex and far more profound than the headline-grabbing benchmarks suggest. As the hardware capabilities and algorithmic requirements finally begin to converge, the critical question is no longer if quantum computers will impact science, but how profoundly—and what new challenges will emerge when theory finally becomes practice?

Eamonn Darcy
AI Technical Director
Sources:

Primary report:

Camps, Daan; Rrapaj, Ermal; Klymko, Katherine; Kim, Hyeongjin; Gott, Kevin; Darbha, Siva; Balewski, Jan; Austin, Brian; Wright, Nicholas J. "Quantum Computing Technology Roadmaps and Capability Assessment for Scientific Computing - An analysis of use cases from the NERSC workload." Lawrence Berkeley National Laboratory (LBNL) / National Energy Research Scientific Computing Center (NERSC), September 11, 2025. Report No. LBNL-2001699. Permalink: https://escholarship.org/uc/item/0xb5w33t. Released under a Creative Commons Attribution-NonCommercial-ShareAlike license.

Sponsorship and disclaimer:

• The report was prepared as an account of work sponsored by an agency of the United States Government. The research was supported by the U.S. Department of Energy (DOE) under Contract No. DE-AC02-05CH11231, through the National Energy Research Scientific Computing Center (NERSC).

• The views and opinions expressed by the authors do not necessarily state or reflect those of the United States Government or any agency thereof.

Internal reference material:

• The academic and technical literature consulted for the report, covering quantum algorithms, resource estimates, and vendor roadmaps, is collected in its Bibliography, which begins on page 52.