
For years, quantum computers have seemed like a futuristic technology: on paper, they can solve problems that classical machines cannot handle in a reasonable time, yet in practice they constantly stumble over the same obstacle: errors.
Now, Chinese researchers have announced a step that many consider a turning point for the entire field: they have crossed a crucial threshold beyond which error correction actually improves system reliability rather than degrading it.
A team led by Jian-Wei Pan of the University of Science and Technology of China has reported that their superconducting quantum computer, Zuchongzhi 3.2, has reached the so-called fault-tolerance threshold.
This is the long-awaited moment: adding error-correction procedures makes calculations more robust as the system grows. Previously, quantum error correction often turned into a paradox: the more checks and additional qubits were added, the more new sources of error appeared, and the overall reliability could only worsen.
The results were published in the journal Physical Review Letters. Importantly, the Chinese team relied on microwave control rather than the DC-pulse-based approach Google used in its demonstration.
This approach targets one of the most problematic types of failure, so-called leakage errors, in which a qubit "leaks" out of its operational state and begins silently spreading problems throughout the circuit. According to the authors, their method of removing leakage entirely with microwave pulses could prove more scalable because it requires less complex control circuitry and less ultra-low-temperature wiring.
To understand the importance of this aspect, it's worth recalling the fundamental problem of quantum computing. Qubits are extremely sensitive to heat, noise, and any microperturbations, so errors constantly arise even during "normal" operation. Quantum error correction tackles this problem differently from classical computing: it distributes information across many qubits and repeatedly measures specific check values, attempting to identify errors without disturbing the computation itself. But each new qubit and each additional measurement also introduces errors, so for a long time systems found themselves in a situation where the "cure" was worse than the "disease."
This is precisely why everything hinges on the error-correction threshold. If the underlying qubits are too noisy, any attempt to correct them only increases the overall error rate. If, however, the physical error rate can be pushed below a certain level, the balance shifts: correction begins to prevail, and enlarging the error-correcting code reduces the resulting logical error rate. In recent years, both China and the United States have invested heavily in one of the most studied approaches, the surface code. In 2022, Pan's group demonstrated the smallest correction element, a distance-3 logical qubit, and Google later moved on to distance 5. In both cases, however, the noise in the physical qubits remained too high to confidently declare that the threshold had been crossed.
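The threshold logic can be sketched with a toy calculation. The scaling law, the threshold value, and the prefactor below are illustrative assumptions (the standard surface-code heuristic p_L ≈ A·(p/p_th)^((d+1)/2)), not figures from the paper:

```python
# Toy illustration of the error-correction threshold (illustrative model,
# not the paper's): below threshold (p < p_th), a larger code distance d
# helps; above it (p > p_th), the "cure" makes things worse.
P_TH = 0.01  # hypothetical threshold error rate
A = 0.01     # hypothetical prefactor

def logical_error(p: float, d: int) -> float:
    """Heuristic logical error rate for physical error rate p, odd distance d."""
    return A * (p / P_TH) ** ((d + 1) // 2)

for p in (0.005, 0.02):  # one rate below threshold, one above
    rates = [logical_error(p, d) for d in (3, 5, 7)]
    trend = "improves" if rates[0] > rates[-1] else "worsens"
    print(f"p={p}: d=3,5,7 -> {[f'{r:.2e}' for r in rates]} ({trend})")
```

Below threshold the sequence shrinks with each step in distance; above it, the same growth in the code amplifies the error instead.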
The picture changed after Google announced progress on the Willow processor in February of this year, demonstrating a below-threshold distance-7 logical qubit while suppressing leakage with DC pulses. The Chinese study now claims a comparable scale on its own platform, but with a different approach. Using the 107-qubit Zuchongzhi 3.2 processor, the researchers implemented a fully microwave-based leakage-suppression scheme and combined it with a surface code, producing a distance-7 logical qubit. In their experiment they observed the key signature of below-threshold operation: as the code distance increased, the overall error decreased rather than increased.
The authors cite an error-suppression ratio of 1.4, meaning that each step up in code distance reduced the logical error rate by roughly a factor of 1.4. External experts praised the result, though with reservations. Physicist Joseph Emerson of the University of Waterloo, who was not involved in the work, called the experiment impressive and emphasized that it addresses one of the hardest problems in quantum computing: error dispersion due to qubit drift. He also noted, however, that practical applications are still far off: current demonstrations remain at a scale incomparable to what real-world problems require.
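Concretely, here is what a suppression ratio of 1.4 buys under the standard surface-code heuristic p_L(d) ≈ A / Λ^((d+1)/2); the formula and the prefactor are assumptions for illustration, not numbers from the paper:

```python
# Toy model of sub-threshold error suppression. Assumed heuristic (not from
# the paper): p_L(d) ≈ A / LAMBDA**((d + 1) / 2), so each step in code
# distance (d -> d + 2) divides the logical error rate by LAMBDA.
LAMBDA = 1.4  # suppression ratio reported for Zuchongzhi 3.2
A = 0.03      # hypothetical prefactor, for illustration only

def logical_error_rate(d: int) -> float:
    """Estimated logical error rate at odd code distance d."""
    return A / LAMBDA ** ((d + 1) / 2)

for d in (3, 5, 7):
    print(f"d={d}: p_L ~ {logical_error_rate(d):.4f}")
```

A ratio of 1.4 is modest compared to what large-scale machines will need, which is why even enthusiastic observers stress that this is a milestone, not an endpoint.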
The news, however, carries a different message: the competition for fault-tolerant quantum computers is no longer a race of promises but increasingly an engineering battle for scalability. The Chinese team believes the microwave approach could solve two of the main problems facing future quantum systems: wiring complexity and hardware overhead. If the idea pans out, the path to machines with hundreds of thousands or even millions of qubits will look a little less far-fetched than it did yesterday.
