‘A truly remarkable breakthrough’: Google’s new quantum chip achieves accuracy milestone
Error-correction feat shows quantum computers will get more accurate as they grow larger
Researchers at Google have built a chip that has enabled them to demonstrate the first ‘below threshold’ quantum calculations — a key milestone in the quest to build quantum computers that are accurate enough to be useful.
The experiment, described on 9 December in Nature¹, shows that with the right error-correction techniques, quantum computers can perform calculations with increasing accuracy as they are scaled up — with the rate of this improvement exceeding a crucial threshold. Current quantum computers are too small and too error-prone for most commercial or scientific applications.
“This has been a goal for 30 years,” said Michael Newman, a research scientist at Google’s headquarters in Mountain View, California, at a press conference announcing the feat. The achievement means that by the end of the decade, quantum computers could enable scientific discoveries that are impossible even with the most powerful supercomputers imaginable, said Charina Chou, the chief operating officer of Google’s quantum-computing arm. “That’s the reason we’re building these things in the first place,” Newman added.
“This work shows a truly remarkable technological breakthrough,” says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China in Shanghai.
Delicate states
Quantum computers encode information in states that can represent a 0 or a 1 — like the bits of ordinary computers — but can also occupy superpositions: any of infinitely many weighted combinations of 0s and 1s. However, these quantum-information states are notoriously delicate, explains Julian Kelly, a physicist at Google who leads the company’s quantum-hardware division. To get a quantum computer to perform useful calculations, “you need quantum information, and you need to protect it from the environment — and from ourselves, as we do manipulations on it”, he says.
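Those superpositions can be pictured concretely. In the standard quantum-information formalism (a textbook convention, not anything specific to Google’s hardware), a single qubit is a normalised pair of complex amplitudes, and the squared magnitudes of those amplitudes give the probabilities of reading out 0 or 1:

```python
import numpy as np

# A qubit state is a normalised vector of two complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An equal superposition of 0 and 1 -- one of infinitely many valid (a, b) pairs.
psi = (zero + one) / np.sqrt(2)

# Squared amplitude magnitudes are the measurement probabilities.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Any tiny disturbance to those amplitudes changes the encoded information, which is why the states are so delicate.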
Aiming for such protection — without which quantum computing would be a non-starter — theoreticians began in 1995 to develop ingenious schemes for spreading one qubit of information across multiple ‘physical’ qubits. The resulting ‘logical qubit’ is resilient to noise — at least on paper. For this technique, called quantum error correction, to work in practice, it would be necessary to show that this spreading of information over multiple qubits robustly lowered error rates.
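The classical ancestor of this idea is the repetition code: copy one bit into several, then recover it by majority vote. True quantum codes (such as the surface code used in this work) are far subtler, because quantum states cannot simply be copied, but the toy classical version below shows why spreading information across more components suppresses errors. It assumes an independent flip probability p per physical bit:

```python
import random

def logical_error_rate(p, n=3, trials=100_000, seed=0):
    """Estimate the failure rate of an n-bit repetition code with
    independent per-bit flip probability p, decoded by majority vote."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n))
        if flips > n // 2:  # a majority of copies flipped: decoding fails
            failures += 1
    return failures / trials

p = 0.05
print(logical_error_rate(p))        # ~0.007: three copies beat a single bit
print(logical_error_rate(p, n=5))   # ~0.001: more redundancy, fewer failures
```

The catch, on paper and in the laboratory alike, is that the extra physical components themselves introduce errors — which is why it had to be shown experimentally that enlarging the code really does lower the overall error rate.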
Over the past few years, several companies — including IBM and Amazon’s AWS — and academic groups have shown that error correction can produce small improvements in accuracy²,³,⁴. Google published a result in early 2023 using 49 qubits in its Sycamore quantum processor, which encodes each physical qubit in a superconducting circuit.
The company’s new chip, called Willow, is a larger, improved version of that technology, with 105 physical qubits. It was developed in a fabrication laboratory that Google built at its quantum-computing campus in Santa Barbara, California, in 2021.
As a first demonstration of Willow’s power, the researchers showed that it could perform, in roughly 5 minutes, a task that would take the world’s largest supercomputer an estimated 10²⁵ years, says Hartmut Neven, who heads Google’s quantum-computing division. This is the latest salvo in the race to show that quantum computers have an advantage over classical ones.
And, by creating logical qubits inside Willow, the Google team has shown that each successive increase in the size of a logical qubit roughly halves its error rate — the hallmark of operating ‘below threshold’.
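A halving per step is exponential suppression. If growing the code from distance d to d + 2 divides the logical error rate by a roughly constant factor Λ ≈ 2 (the standard surface-code figure of merit; the symbols and the starting error rate below are conventional assumptions, not numbers quoted in the article), accuracy compounds quickly:

```python
# Below threshold, each increase in code distance (d -> d + 2) divides the
# logical error rate by a roughly constant factor, here Lambda = 2.
LAMBDA = 2.0
err = 3e-3  # assumed error rate for a small (distance-3) logical qubit

d = 3
while d <= 11:
    print(f"distance {d:2d}: logical error rate ~{err:.1e}")
    d += 2
    err /= LAMBDA
# distance 3 -> 11: error rate falls from 3.0e-03 to 1.9e-04
```

Above threshold the same scaling works against you: adding qubits would make the logical qubit worse, which is why crossing this line matters.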
“This is a very impressive demonstration of solidly being below threshold,” says Barbara Terhal, a specialist in quantum error correction at the Delft University of Technology in the Netherlands. Mikhail Lukin, a physicist at Harvard University in Cambridge, Massachusetts, adds, “It clearly shows that the idea works.”
Quantum-computing endgame
Kelly says the team’s results also suggest that this rate of improvement is sustainable, and that it will enable future quantum chips to reach rates of one error per 10 million steps. That’s the level of accuracy that researchers generally see as crucial for making quantum computers commercially useful. “Error correction is the endgame for quantum computing,” he says. “This is the quantum computer everyone’s imagined using.”
Achieving such a low error rate will require each logical qubit to be made of around 1,000 physical qubits, the company estimates — although further improvements in error-correction techniques could bring that overhead down, perhaps as low as 200 qubits, Newman says. Researchers at IBM and other laboratories have also been making dramatic progress with schemes that require fewer qubits². This shows that the field of quantum computing is reaching a critical juncture, says Lukin. “It’s really an exciting time.”
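The ~1,000-qubit overhead can be roughly sanity-checked. A distance-d surface code of the kind Willow runs uses 2d² − 1 physical qubits (d² data qubits plus d² − 1 measure qubits; this count, the starting error rate, and the suppression factors below are standard textbook assumptions rather than figures from the article). Projecting from a distance-3 error rate of 10⁻³ down to one error in 10 million, a halving per step lands near 2,000 physical qubits, while a stronger suppression factor brings it under 600 — bracketing the company’s estimate:

```python
def qubits_needed(target_err, base_err=1e-3, lam=2.0, base_d=3):
    """Smallest surface-code distance whose projected logical error rate
    meets target_err, plus the physical qubits it needs (2*d^2 - 1)."""
    d, err = base_d, base_err
    while err > target_err:
        d += 2          # each distance step divides the error by lam
        err /= lam
    return d, 2 * d * d - 1

print(qubits_needed(1e-7))           # -> (31, 1921): halving per step
print(qubits_needed(1e-7, lam=4.0))  # -> (17, 577): stronger suppression
```

This is why improving the per-step suppression factor, not just adding qubits, dominates the engineering road map.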
Nevertheless, challenges remain, says Terhal. In addition to building robust logical qubits, researchers will need to network many logical qubits together so that they can share and exchange quantum states.
John Preskill, a theoretical physicist at the California Institute of Technology in Pasadena who helped to develop the theory of quantum error correction, says that showing error-correction schemes can preserve the information in qubits was an important step, but that correcting errors during computation will be even more important. “We want to do protected qubit operations, not just memory,” says Preskill, who also collaborates on AWS’s quantum-computing experiments.
doi: https://doi.org/10.1038/d41586-024-04028-3
Additional reporting by Elizabeth Gibney.
This story originally appeared in Nature. Author: Davide Castelvecchi.