New quantum computer smashes 'quantum supremacy' record by a factor of 100 — and it consumes 30,000 times less power - think of how many AI anime titties we could generate

New quantum computer smashes 'quantum supremacy' record by a factor of 100 — and it consumes 30,000 times less power

The 56-qubit H2-1 computer has broken the previous record in the 'quantum supremacy' benchmark first set by Google in 2019.

By Keumars Afifi-Sabet, published July 11, 2024

Scientists achieved an XEB score of 0.35, which means the H2 quantum computer can produce error-free results 35% of the time (Image credit: Quantinuum)

A new quantum computer has broken a world record in "quantum supremacy," topping the benchmark performance set by Google's Sycamore machine by a factor of 100.

Using the new 56-qubit H2-1 computer, scientists at quantum computing company Quantinuum ran various experiments to benchmark the machine's performance levels and the quality of the qubits used. They published their results June 4 in a study uploaded to the preprint database arXiv. The study has not been peer-reviewed yet.

To demonstrate the potential of the quantum computer, the scientists at Quantinuum used a well-known algorithm to measure how noisy, or error-prone, qubits were.

Quantum computers can perform calculations in parallel thanks to the laws of quantum mechanics and entanglement between qubits, meaning the states of entangled qubits are linked so that acting on one instantly affects the others. Classical computers, by contrast, can work only in sequence.

Adding more qubits to a system also scales up the power of a machine exponentially; scientists predict that quantum computers will one day perform complex calculations in seconds that a classical supercomputer would have taken thousands of years to solve.
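To give a rough sense of that scaling: the state of an n-qubit register is described by 2^n complex amplitudes, so even storing the state of a 56-qubit machine like H2-1 classically is already enormous. A minimal back-of-the-envelope sketch (Python, illustrative numbers of my own, not figures from the article):

```python
# Rough illustration: memory needed to store a full n-qubit state vector
# classically, assuming one complex128 amplitude (16 bytes) per basis state.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16  # 2^n amplitudes * 16 bytes each

for n in (20, 40, 56):
    print(f"{n} qubits: {statevector_bytes(n) / 1e9:,.2f} GB")
# 20 qubits fit in ~0.02 GB, 40 qubits need ~18,000 GB,
# and 56 qubits need roughly 1.2 billion GB (about an exabyte).
```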

The point where quantum computers overtake classical ones is known as "quantum supremacy," but achieving this milestone in a practical way would need a quantum computer with millions of qubits. The largest machine today has only about 1,000 qubits.

The reason we would need so many qubits for "quantum supremacy" is that they are inherently prone to error, so many would be needed to correct those errors. That's why many researchers are now focusing on building more reliable qubits, rather than simply adding more qubits to machines.

The point when quantum computers overtake classical ones is known as "quantum supremacy," but achieving this milestone would need a quantum computer with millions of qubits. (Image credit: Quantinuum)

The team tested the fidelity of H2-1's output using what's known as the linear cross entropy benchmark (XEB). XEB spits out results between 0 (none of the output is error-free) and 1 (completely error-free), Quantinuum representatives said in a statement.

Scientists at Google first tested the company's Sycamore quantum computer using XEB in 2019, demonstrating that it could complete a calculation in 200 seconds that would have taken the most powerful supercomputer at the time 10,000 years to finish. They registered an XEB result of approximately 0.002 with the 53 superconducting qubits built into Sycamore.

But in the new study, Quantinuum scientists — in partnership with JPMorgan, Caltech and Argonne National Laboratory — achieved an XEB score of approximately 0.35. This means the H2 quantum computer can produce error-free results 35% of the time.
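For readers curious how scores like Sycamore's 0.002 and H2-1's 0.35 are computed: linear XEB compares the ideal output probabilities of the bitstrings you actually measured against what a uniformly random (fully noisy) sampler would give. A minimal sketch of the standard formula, with illustrative function and variable names of my own (the paper's exact procedure may differ in detail):

```python
import numpy as np

def linear_xeb(samples, ideal_probs, n_qubits):
    """Linear cross-entropy benchmarking (XEB) fidelity.

    samples: measured bitstrings, encoded as integers in [0, 2**n_qubits)
    ideal_probs: length-2**n_qubits array of noiseless output probabilities
                 for the benchmarked random circuit
    Returns ~0 for uniformly random output (all noise), ~1 for ideal output.
    """
    p = np.array([ideal_probs[int(s)] for s in samples])
    return (2 ** n_qubits) * p.mean() - 1.0
```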

"We are entirely focused on the path to universal fault tolerant quantum computers," Ilyas Khan, chief product officer at Quantinuum and founder of Cambridge Quantum Computing, said in the statement. "This objective has not changed, but what has changed in the past few months is clear evidence of the advances that have been made possible due to the work and the investment that has been made over many, many years."

Quantinuum previously collaborated with Microsoft to demonstrate "logical qubits" that had an error rate 800 times lower than physical qubits.

In the study, published in April, scientists demonstrated they could run experiments with the logical qubits with an error rate of just 1 in 100,000 — which is much stronger than the 1-in-100 error rate of physical qubits, Microsoft representatives said.

"These results show that whilst the full benefits of fault tolerant quantum computers have not changed in nature, they may be reachable earlier than was originally expected," added Khan.
 
Light chips baby. Let’s fucking gooo
as optimistic as I am, contemporary electronics developments have been wasted on rendering higher-fidelity imagery, harvesting people's personal information and mining shitcoins, and I can only assume photonics/spintronics will be wasted in a similar manner

which is absurd, in a cosmic sense.
 
The reason we would need so many qubits for "quantum supremacy" is that they are inherently prone to error, so many would be needed to correct those errors. That's why many researchers are now focusing on building more reliable qubits, rather than simply adding more qubits to machines.
Once this gets figured out, it could be relatively simple to scale physical qubit counts to the millions or billions (but a smaller number of "logical qubits"), using the same lithography and copy-pasting techniques that allow classical computers to have tens to hundreds of billions of transistors.
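To put rough numbers on that physical-to-logical trade-off (my own illustrative figures, not from the article): error-correcting codes spend many physical qubits per logical qubit, with overheads often quoted around a thousand to one for surface-code-style schemes at useful error rates.

```python
# Illustrative only: assumed overhead of ~1,000 physical qubits per logical
# qubit, a commonly quoted ballpark for surface-code-style error correction.
PHYSICAL_PER_LOGICAL = 1_000  # assumption; varies with code and error rate

for physical in (1_000_000, 1_000_000_000):
    logical = physical // PHYSICAL_PER_LOGICAL
    print(f"{physical:,} physical qubits -> ~{logical:,} logical qubits")
# 1,000,000 physical -> ~1,000 logical
# 1,000,000,000 physical -> ~1,000,000 logical
```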
 
Once this gets figured out, it could be relatively simple to scale physical qubit counts to the millions or billions (but a smaller number of "logical qubits"), using the same lithography and copy-pasting techniques that allow classical computers to have tens to hundreds of billions of transistors.
Moore’s law is for pussies.
 
Photonics and quantum computing are going to be big fields. Any Kiwis with scientifically and mathematically talented children might consider giving them a gentle nudge in these directions, where there will be real discoveries and fortunes made in 15-30 years. The first 15 years of transistors didn't make anyone big fortunes; they had to be sponsored by companies with a ton of funding, like Bell Labs, just to exist. They were not profitable.

The time from the first transistors to the earliest personal computers that turned into enduring brands was roughly 30 years. We're looking at a similar timescale here. It would have been a good long-term idea if you were a parent of a bookish, math-inclined child in 1955 to guide them toward this weird "computer" stuff you'd been hearing about.
 
Just linking this in as well:
Thread 'Multiple nations enact mysterious export controls on quantum computers'
https://kiwifarms.net/threads/multi...-export-controls-on-quantum-computers.195350/
People have gotten way ahead of themselves with AI and ML. Quantum and biological computing are well-underway with their development and will be here first, which will facilitate AI.

But for a bit we will have these non-AI systems of immense power. And during that time you will see amazing things: namely encryption/blockchain/et cetera will be totally breached and the world will once again rethink data security measures. It will go back to having to put a man in the room that houses the data in order to steal it.

Hope you saved your typewriters.
 
People have gotten way ahead of themselves with AI and ML. Quantum and biological computing are well-underway with their development and will be here first, which will facilitate AI.

But for a bit we will have these non-AI systems of immense power. And during that time you will see amazing things: namely encryption/blockchain/et cetera will be totally breached and the world will once again rethink data security measures. It will go back to having to put a man in the room that houses the data in order to steal it.

Hope you saved your typewriters.

Just because you can use a quantum computer to break encryption does not mean it is without cost in terms of compute or energy resources.

This technology will be so expensive and have such significant technological limits in its first 10-20 years that it will be used in a very limited way. In that time, I imagine the arms race for post-quantum encryption will gear up and there will be ways to migrate to post-quantum-safe protocols. And so on, until the next big thing. I of course refer to singularity engines, which Sid Meier informed me long ago would be the follow-up to quantum ones.
 
People have gotten way ahead of themselves with AI and ML. Quantum and biological computing are well-underway with their development and will be here first, which will facilitate AI.

But for a bit we will have these non-AI systems of immense power. And during that time you will see amazing things: namely encryption/blockchain/et cetera will be totally breached and the world will once again rethink data security measures. It will go back to having to put a man in the room that houses the data in order to steal it.

Hope you saved your typewriters.
There are plenty of quantum-resistant encryption schemes out there. For AES it's a simple matter of using 256-bit keys. Public key is harder, not so much in terms of the technology, but in terms of upgrading standards that are deployed everywhere. I imagine TLS 1.5 or so will probably start introducing quantum-resistant algorithms... but honestly TLS is hilariously easy to break for anybody who can afford a quantum computer, so I almost don't care.
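As a concrete illustration of the "just use 256-bit keys" point: Grover's algorithm roughly halves the effective symmetric key length, so AES-256 is generally treated as retaining ~128-bit security against a quantum attacker. A minimal sketch using the widely available Python `cryptography` package (my own example, not something referenced in this thread):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 256-bit key: against Grover's algorithm this is treated as ~128-bit
# effective security, still considered comfortably out of reach.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # 96-bit nonce; never reuse with the same key
ciphertext = aesgcm.encrypt(nonce, b"hello post-quantum world", b"header")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"header")
assert plaintext == b"hello post-quantum world"
```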
 
So have they actually written any software for these things or is it still in the "it works!" stage?
From my understanding, most likely. It seems feasible to implement programmable logic given the mechanism by which the encoding is done, though I don't know for sure. Given the state of Silicon Valley it could equally be hot air. It would be an impressive feat to translate particle-state physics into embedded logic.
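For what it's worth, there is already real software tooling for these machines: circuit-level SDKs like Quantinuum's TKET or IBM's Qiskit let you write gate-level programs and submit them to simulators or hardware. A minimal Qiskit sketch of a two-qubit entangling circuit (a generic example of mine, not tied to the H2-1 specifically):

```python
from qiskit import QuantumCircuit

# Build a Bell-state circuit: Hadamard on qubit 0, then CNOT entangling 0 and 1,
# followed by measurement of both qubits into classical bits.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

print(qc.draw())
# The circuit object can then be transpiled and submitted to a simulator or
# to real hardware through a vendor's provider/runtime package.
```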
 