Encryption timelines shorten as two groups cut qubit requirements for Shor's algorithm
Caltech spinout Oratomic claims tens of thousands of qubits could break encryption, while Google's 10x more efficient Shor implementation compounds the urgency. The University of Sydney publishes an error correction scheme that cuts the physical qubit cost of fault-tolerant computing.
The gap between theoretical quantum attacks on encryption and real hardware capability closed further this week. Two independent groups announced algorithmic improvements that reduce the qubit requirements for running Shor’s algorithm at cryptographically relevant scale. Separately, a University of Sydney researcher published a new error correction approach in Nature Physics that reduces the physical qubit overhead needed for fault-tolerant computation. Taken together, these results reinforce the message from Google’s own whitepaper last week: the 2029 migration deadline for post-quantum cryptography is not conservative.
Caltech and Google: the encryption threat narrows
Quanta Magazine reported today on two parallel advances that reduce the qubits needed to break elliptic curve encryption using Shor’s algorithm.
The Caltech group, led by physicist Dolev Bluvstein and collaborator Madelyn Cain, published a design showing that a quantum computer with tens of thousands of qubits could break common encryption. Bluvstein has since founded a company, Oratomic, to build the machine. The Caltech approach uses reconfigurable neutral-atom qubits, consistent with Google’s own dual-modality strategy reported earlier this week. The paper is on arXiv (preprint 2603.28627).
The context matters here. Prior estimates for running Shor’s algorithm at cryptographic scale sat around 1 million qubits, itself a dramatic reduction from earlier estimates in the billions. The Caltech result pushes that number down to tens of thousands, bringing it closer to hardware that exists today or will exist within a few years. Neither Oratomic nor Google has hardware capable of breaking encryption right now. But the trajectory is clear.
Google’s contribution is equally significant: a 10x more efficient implementation of Shor’s algorithm, detailed in the same whitepaper on quantum risk to cryptocurrency that the company released last week. A 10x efficiency gain in the algorithm means the hardware requirements drop by roughly the same factor, further compressing the timeline.
The practical implication for security teams has not changed from last week but is now better quantified: the window to migrate cryptographic infrastructure to post-quantum standards is measured in years, not decades. Organisations protecting data with long sensitivity lifespans (medical records, legal documents, state secrets, financial contracts) face the most immediate risk from harvest-now-decrypt-later attacks, where adversaries collect encrypted data today and decrypt it once quantum hardware matures.
University of Sydney: lower qubit overhead for error correction
A complementary development from the University of Sydney reinforces the hardware side of the picture. Dr. Dominic Williamson and collaborator Theodore Yoder published “Low-overhead fault-tolerant quantum computation by gauging logical operators” in Nature Physics. The work presents a new approach to quantum error correction that reduces the number of physical qubits required to encode a single logical qubit.
Error correction is the central bottleneck in building large-scale quantum computers. Current approaches require hundreds to thousands of physical qubits to reliably represent a single logical qubit. Williamson and Yoder’s technique reduces this overhead by using gauge symmetry to simplify the correction operations, cutting the physical qubit cost without sacrificing fault tolerance.
The significance for near-term hardware is indirect but real. Every reduction in the physical-to-logical qubit ratio means a machine with a given qubit count can perform more useful computation. Combined with the Caltech and Google algorithmic improvements, the path to cryptographically relevant machines shortens from both ends at once: the algorithm needs fewer logical qubits, and each logical qubit needs fewer physical qubits.
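To see why attacking the problem from both ends matters, here is a back-of-the-envelope sketch in Python. Every number in it is an illustrative assumption, not a figure from the Caltech, Google, or Sydney papers; the point is only that the two levers multiply.

```python
# Back-of-the-envelope estimate of physical qubit requirements.
# All numbers below are illustrative assumptions, NOT figures from
# the Caltech, Google, or Sydney papers.

def physical_qubits_needed(logical_qubits: int, overhead: int) -> int:
    """Total physical qubits = logical qubits x physical-per-logical overhead."""
    return logical_qubits * overhead

# Hypothetical baseline: an attack needing ~2,000 logical qubits,
# at ~500 physical qubits per logical qubit.
baseline = physical_qubits_needed(logical_qubits=2_000, overhead=500)

# Lever 1 (algorithm): a 10x more efficient Shor implementation
# cuts the logical qubit count by roughly 10x.
# Lever 2 (error correction): a lower-overhead code cuts the
# physical-per-logical ratio, here assumed to be 5x.
improved = physical_qubits_needed(logical_qubits=200, overhead=100)

print(f"baseline:  {baseline:,} physical qubits")   # 1,000,000
print(f"improved:  {improved:,} physical qubits")   # 20,000
print(f"reduction: {baseline // improved}x")        # 50x
```

The sketch is a multiplication, which is the whole story: independent gains on the algorithmic side and the error correction side compound, and that is why two unrelated results landing in the same week matters.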
This is an Australian research result worth tracking closely. The University of Sydney has a strong quantum computing research group, and results like this feed directly into the broader fault-tolerant roadmap that IBM, Google, and IonQ are all executing against.
What this means for planning
The convergence of these results in a single week is notable. The cryptographic threat is not academic. Google filed a responsible disclosure, Caltech formed a company, and Nature Physics published improved error correction, all within days of each other.
For technology leaders the action items have not changed, but the urgency has increased:
- Complete a cryptographic asset inventory before the end of 2026
- Identify any systems with data sensitivity lifespans beyond 5 years
- Begin evaluating NIST-standardised PQC algorithms (ML-KEM, ML-DSA, SLH-DSA) for those systems; see the sketch after this list
- Execute data processing agreements (DPAs) with all vendors processing sensitive data to ensure they have PQC migration plans
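As a starting point for the evaluation item above, here is a minimal sketch of an ML-KEM key encapsulation round trip using the open-source liboqs Python bindings (the `oqs` package). It is illustrative, not a recommendation of a specific library, and the identifier `ML-KEM-768` assumes a liboqs build recent enough to expose the final NIST algorithm names.

```python
# Minimal ML-KEM-768 round trip with the liboqs Python bindings
# (installed as liboqs-python). Illustrative only: check which
# algorithms your installed liboqs build actually exposes.
import oqs

ALG = "ML-KEM-768"  # assumes a liboqs build with the final NIST names

# Receiver generates a keypair and publishes the public key.
with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates a fresh shared secret against that public key.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
print(f"{ALG} shared secret established ({len(shared_secret_sender)} bytes)")
```

Note that most early production deployments pair ML-KEM with a classical key exchange (for example X25519) in a hybrid mode, so neither component is a single point of failure during the transition.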
Sources and Further Reading
Primary sources:
- Quanta Magazine coverage of Caltech and Google advances: quantamagazine.org
- Caltech/Oratomic preprint: arXiv 2603.28627
- Phys.org coverage of the University of Sydney error correction paper: phys.org
- The Quantum Insider on Sydney result: thequantuminsider.com
Related Quantum Brief coverage:
- Google warns crypto on quantum risk ahead of 2029 timeline — the Google whitepaper that set the 2029 deadline
- Willow access and 10,000-qubit potential push — Caltech’s earlier theoretical result on qubit requirements