Scientists just simulated the “impossible” — fault-tolerant quantum code cracked at last - A team has broken a barrier in quantum computing by creating an algorithm that lets normal computers simulate fault-tolerant bosonic code circuits.

ScienceDaily (archive.today)
4 Jul 2025 02:29:40 UTC

Quantum computers can perform complex computations thanks to their ability to represent an enormous number of different states at the same time in a so-called quantum superposition. These superpositions of states are incredibly difficult to describe. Now, a research team has found a relatively simple method to simulate some relevant quantum superpositions of states. The illustration shows one of these superpositions, which can be created inside what’s known as a continuous-variable quantum computer. The team was able to observe how these states change when they interact with each other, and they were also able to simulate those changes using wave-like patterns - like the ones you see in the image. Credit: Chalmers University of Technology | Cameron Calcluth


Quantum computers still face a major hurdle on their pathway to practical use cases: their limited ability to correct the errors that arise during computation. To develop truly reliable quantum computers, researchers must be able to simulate quantum computations using conventional computers to verify their correctness - a vital yet extraordinarily difficult task. Now, in a world-first, researchers from Chalmers University of Technology in Sweden, the University of Milan, the University of Granada, and the University of Tokyo have unveiled a method for simulating specific types of error-corrected quantum computations - a significant leap forward in the quest for robust quantum technologies.

Quantum computers have the potential to solve complex problems that no supercomputer today can handle. In the foreseeable future, quantum technology's computing power is expected to revolutionise fundamental ways of solving problems in medicine, energy, encryption, AI, and logistics.

Despite these promises, the technology faces a major challenge: the need for correcting the errors arising in a quantum computation. While conventional computers also experience errors, these can be quickly and reliably corrected using well-established techniques before they can cause problems. In contrast, quantum computers are subject to far more errors, which are additionally harder to detect and correct. Quantum systems are still not fault-tolerant and therefore not yet fully reliable.

To verify the accuracy of a quantum computation, researchers simulate - or mimic - the calculations using conventional computers. One particularly important type of quantum computation that researchers are therefore interested in simulating is one that can withstand disturbances and effectively correct errors. However, the immense complexity of quantum computations makes such simulations extremely demanding - so much so that, in some cases, even the world's best conventional supercomputer would take the age of the universe to reproduce the result.
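To get a rough feel for where that blow-up comes from, here is a back-of-the-envelope sketch in Python (illustrative only, not taken from the study): it simply counts the memory a naive state-vector simulator would need to store the full quantum state of n qubits, assuming one 16-byte complex amplitude per basis state.

# Illustrative only: memory needed by a naive dense state-vector simulator.
# An n-qubit state has 2**n complex amplitudes; assume 16 bytes (complex128) each.
def statevector_bytes(n_qubits: int) -> int:
    return 16 * (2 ** n_qubits)

for n in (10, 30, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.1f} GiB")

# Roughly: 10 qubits fit in 16 KiB, 30 qubits need 16 GiB of RAM,
# and 50 qubits already need about 16 million GiB (~16 PiB).

The storage doubles with every added qubit, and continuous-variable (bosonic) systems are even worse in principle, since each mode has infinitely many levels - which is why structure-exploiting algorithms, rather than brute force, are needed.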

Researchers from Chalmers University of Technology, the University of Milan, the University of Granada and the University of Tokyo have now become the first in the world to present a method for accurately simulating a certain type of quantum computation that is particularly suitable for error correction, but which thus far has been very difficult to simulate. The breakthrough tackles a long-standing challenge in quantum research.

"We have discovered a way to simulate a specific type of quantum computation where previous methods have not been effective. This means that we can now simulate quantum computations with an error correction code used for fault tolerance, which is crucial for being able to build better and more robust quantum computers in the future," says Cameron Calcluth, PhD in Applied Quantum Physics at Chalmers and first author of a study recently published in Physical Review Letters.

Error-correcting quantum computations - demanding yet crucial

The limited ability of quantum computers to correct errors stems from their fundamental building blocks - qubits - which have the potential for immense computational power but are also highly sensitive. The computational power of quantum computers relies on the quantum mechanical phenomenon of superposition, meaning qubits can simultaneously hold the values 1 and 0, as well as all intermediate states, in any combination. The computational capacity increases exponentially with each additional qubit, but the trade-off is their extreme susceptibility to disturbances.

"The slightest noise from the surroundings in the form of vibrations, electromagnetic radiation, or a change in temperature can cause the qubits to miscalculate or even lose their quantum state, their coherence, thereby also losing their capacity to continue calculating," says Calcluth.

To address this issue, error correction codes are used to distribute information across multiple subsystems, allowing errors to be detected and corrected without destroying the quantum information. One way is to encode the quantum information of a qubit into the multiple - possibly infinite - energy levels of a vibrating quantum mechanical system. This is called a bosonic code. However, simulating quantum computations with bosonic codes is particularly challenging because of the multiple energy levels, and researchers have been unable to reliably simulate them using conventional computers - until now.

New mathematical tool key in the researchers' solution

The method developed by the researchers consists of an algorithm capable of simulating quantum computations that use a type of bosonic code known as the Gottesman-Kitaev-Preskill (GKP) code. This code is commonly used in leading implementations of quantum computers.
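For context - this is standard textbook background on GKP states, not the authors' new algorithm - an approximate GKP logical |0> can be pictured in the position representation as a comb of narrow Gaussian peaks spaced 2*sqrt(pi) apart under a broad Gaussian envelope. A minimal numpy sketch follows; the squeezing parameters delta and kappa are illustrative values, not taken from the paper, and the sketch shows the qubit version for concreteness, whereas the published result concerns odd-dimensional GKP states.

import numpy as np

def approx_gkp_zero(x, delta=0.3, kappa=0.3, n_peaks=10):
    # Approximate square-lattice GKP logical |0> wavefunction on a position grid.
    psi = np.zeros_like(x)
    for s in range(-n_peaks, n_peaks + 1):
        centre = 2 * s * np.sqrt(np.pi)                   # |0> peaks sit at even multiples of sqrt(pi)
        envelope = np.exp(-0.5 * (kappa * centre) ** 2)   # broad Gaussian envelope damping far-out peaks
        psi += envelope * np.exp(-((x - centre) ** 2) / (2 * delta ** 2))
    dx = x[1] - x[0]
    return psi / np.sqrt(np.sum(psi ** 2) * dx)           # normalise numerically on the grid

x = np.linspace(-12.0, 12.0, 4001)
psi0 = approx_gkp_zero(x)
print(f"norm check: {np.sum(psi0 ** 2) * (x[1] - x[0]):.4f}")  # ~1.0000

Shifting the comb by sqrt(pi) gives the logical |1>. Small position or momentum kicks from noise displace the peaks slightly, and because the peaks are well separated, a displacement smaller than about sqrt(pi)/2 can be measured and undone - the intuition behind the GKP code's error-correcting power. Simulating circuits built from realistic, finitely squeezed states like the one above is what the new method targets.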

"The way it stores quantum information makes it easier for quantum computers to correct errors, which in turn makes them less sensitive to noise and disturbances. Due to their deeply quantum mechanical nature, GKP codes have been extremely difficult to simulate using conventional computers. But now we have finally found a unique way to do this much more effectively than with previous methods," says Giulia Ferrini, Associate Professor of Applied Quantum Physics at Chalmers and co-author of the study.

The researchers managed to use the code in their algorithm by creating a new mathematical tool. Thanks to the new method, researchers can now more reliably test and validate a quantum computer's calculations.

"This opens up entirely new ways of simulating quantum computations that we have previously been unable to test but are crucial for being able to build stable and scalable quantum computers," says Ferrini.

More about the research

The article “Classical simulation of circuits with realistic odd-dimensional Gottesman-Kitaev-Preskill states” has been published in Physical Review Letters. The authors are Cameron Calcluth, Giulia Ferrini, Oliver Hahn, Juani Bermejo-Vega and Alessandro Ferraro. The researchers are active at Chalmers University of Technology, Sweden; the University of Milan, Italy; the University of Granada, Spain; and the University of Tokyo, Japan.

Story Source:
Materials provided by Chalmers University of Technology. Note: Content may be edited for style and length.

Journal Reference:
  1. Cameron Calcluth, Oliver Hahn, Juani Bermejo-Vega, Alessandro Ferraro, Giulia Ferrini. Classical Simulation of Circuits with Realistic Odd-Dimensional Gottesman-Kitaev-Preskill States. Physical Review Letters, 2025; 135 (1) DOI: 10.1103/xmtw-g54f
 
Wow that all sounds amazing to my ignorant ass! Surely someone who knows what this is talking about won’t come along and spoil it as a lark pushed by the media to aggrandize something and embellish for views!

Sorry dudes, been burned too many times to buy it anymore.
From my read, this will hopefully help make developing quantum computers easier, not some breakthrough with quantum computers themselves. Definitely useful, but not something that will directly impact anyone not in the field.
 
Sorry dudes, been burned too many times to buy it anymore.
I am as uneducated on this stuff as the next guy, but this is probably what people said about flying too back in the day. Quantum shit is crazy but great advancements take time.
 
I am as uneducated on this stuff as the next guy, but this is probably what people said about flying too back in the day. Quantum shit is crazy but great advancements take time.
Yeah, but science rags always put out these articles touting AMAZING DISCOVERIES and DOING THE IMPOSSIBLE, like someone just figured out how to unlock warp drives, but then someone who actually knows a bit comes along and spoils the surprise: it isn’t actually that revolutionary, and it probably isn’t even peer reviewed yet, which will debunk everything claimed originally. For instance.

 
Wow that all sounds amazing to my ignorant ass! Surely someone who knows what this is talking about won’t come along and spoil it as a lark pushed by the media to aggrandize something and embellish for views!

Sorry dudes, been burned too many times to buy it anymore.

Pretty much. You will know exactly the moment that quantum computing becomes viable in a big way. The first thing anyone who cracks it will do is grab Satoshi’s bitcoin and start selling it off to become a billionaire.

And then crypto will crash like a lead balloon dumped from lower orbit.
 
Pretty much. You will know exactly the moment that quantum computing becomes viable in a big way. The first thing anyone who cracks it will do is grab Satoshi’s bitcoin and start selling it off to become a billionaire.

And then crypto will crash like a lead balloon dumped from lower orbit.
The average Joe will be almost completely indifferent to actual quantum computers. Their main uses are in the realms of science and cryptography. I'd be shocked if the average Joe could be arsed to care about it once they realize that it's not going to create a general intelligence AI or make their video games look better. It'll just be one more form of esoteric computing that only specialists have a reason to learn about.
 
Yeah, it can accurately guess the gibberish to effectively steal your bitcoin, but can it reliably solve a captcha and pinpoint which squares depict the mythical bicycle? The rim of the back tire is slightly in the corner there, does it count? Checkmate, you pathetic machine, cryptocurrency secured.
 
Now show me one that doesn't require an entire AI model training datacenter's worth of support equipment and supercomputers to actually make it work and decipher a single output. Until you can do that, it's just the usual Pop-Sci puffery.
 
Alright I ran this through my super powerful AI and I can confirm that this breakthrough is huge, but will probably do nothing.
 
Yeah, but science rags always put out these articles touting AMAZING DISCOVERIES and DOING THE IMPOSSIBLE, like someone just figured out how to unlock warp drives, but then someone who actually knows a bit comes along and spoils the surprise: it isn’t actually that revolutionary, and it probably isn’t even peer reviewed yet, which will debunk everything claimed originally. For instance.

Bro you're comparing an article on Science Daily to some rag called space dot com, and an announcement from an application team coordinating across 4 prestigious universities with one by 2 fucking theorists at the University of Buffalo.

I mean, I get it, pop science media is the worst kind of hyperbolic cancer, but this did an admirable job discussing a breakthrough towards a pretty major part of what they'll need to verify the underlying error correction is working correctly. It gets a lot harder when bits can now exist in a superposition of states instead of binary on/off.
 
This is pretty big. One of the last major 'practical' hurdles/questions around the viability of the tech. Once it is fully solved, most of the remaining things are about scale and amount (the part we are really good at), not a question of "if" it can be done, just how long until we can scale it up cheaply and easily enough.
 
This is pretty big. One of the last major 'practical' hurdles/questions around the viability of the tech. Once it is fully solved, most of the remaining things are about scale and amount (the part we are really good at), not a question of "if" it can be done, just how long until we can scale it up cheaply and easily enough.
Decoherence is the other major, major problem they have to address, because it only amplifies as you scale up and limits how many operations you can chain together while taking advantage of the qubits.
 
Decoherence is the other major, major problem they have to address, because it only amplifies as you scale up and limits how many operations you can chain together while taking advantage of the qubits.

That’s why I said one of. I was under the impression that this (fault tolerance) was the issue thought to be the biggest, most challenging question mark. But yeah, it’s still just a step. A really big step, in fact.
 