Timon Harz
December 12, 2024
Willow: How Google aims to improve the performance of quantum computers with a new chip
Quantum computers are the future, but so far they are hardly suitable for practical use because their error rates are too high. A special chip from Google is set to change that.

Google says it has taken a decisive step towards overcoming one of the biggest challenges in quantum computing. The new special-purpose chip "Willow" and a new error-correction method have paved the way for the development of practically usable quantum computers, said German computer scientist Hartmut Neven, founder and head of Google's Quantum Artificial Intelligence Laboratory.
Error rates below threshold
In the scientific journal Nature, Neven and his team report that, for the first time, quantum error correction has been achieved with error rates below a critical threshold. Error correction is crucial for the development of scalable, practically usable quantum computers.
Quantum computers can solve certain mathematical problems much faster than classical computers, for example in cryptography, in materials research or in machine learning for artificial intelligence applications. However, the systems developed so far are too small and make too many errors to provide real added value. Another problem is that the error rate increases as more computing units ("qubits") are added.
Bundling the computing units
To get this problem under control, the Google team combined several error-prone physical qubits into a single, less error-prone logical qubit. To demonstrate this approach, the researchers used the newly developed quantum processor "Willow".
Neven and his team emphasize that the method and the new chip make scalable, error-corrected quantum computers possible. However, the researchers also note that the error rate achieved is not yet low enough for a practically usable quantum computer. They expect that significantly more physical qubits will be needed to reach satisfactory rates, and that using more qubits with this method will also lead to longer computing times.
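As a rough illustration of why bundling qubits helps (a textbook relation for surface-code-style error correction, not a figure taken from the Nature paper itself): below the threshold, the logical error rate of a code of distance $d$ is expected to fall roughly exponentially as the code is made larger,

$$\varepsilon_L \;\approx\; A \left(\frac{p}{p_{\text{th}}}\right)^{(d+1)/2},$$

where $p$ is the error rate of the physical qubits, $p_{\text{th}}$ is the threshold, and a larger distance $d$ requires more physical qubits per logical qubit. Above the threshold ($p > p_{\text{th}}$) the same construction makes errors worse, which is why operating below the threshold is the decisive milestone.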
"Work meets high standards"
Markus Müller, Professor of Theoretical Quantum Technology at RWTH Aachen University, explained that the Google team had, for the first time, experimentally demonstrated quantum error correction well below the critical error threshold, using a method that is scalable in principle. "The work methodologically meets the high standards that are usual in the research field."
Michael Hartmann, Professor of Theoretical Physics at Friedrich-Alexander University Erlangen-Nuremberg, also praised the scientific quality of the work: "The outlook given is not unfounded." He noted, however, that the authors make fault-tolerant computing conditional on the results being scaled to significantly larger numbers of qubits.
"Still a long way to go"
"With the current quality of qubits, 100,000 to a million qubits will be needed to be able to carry out large, fault-tolerant calculations that are beyond the capabilities of classical supercomputers," wrote Hartmann in the Science Media Center (SMC). The current work presents results from a chip with 105 qubits. "This shows how far we still have to go."