analog31 6 days ago

It turns out that error correction was easy on digital computers, and was essentially a solved problem early in their development. In fact, "noise immunity" is arguably the defining feature of a digital system. And error correction can happen at each gate, since there's no reason to propagate an indeterminate number.
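
A tiny simulation makes this concrete (my own sketch; the stage count and noise level are made-up values): a chain of stages that each inject noise, run with and without snapping the signal back to a logic rail at every stage. With per-stage restoration the error essentially never accumulates; without it, the noise grows like the square root of the chain length.

    import random

    STAGES = 100        # depth of the gate/amplifier chain (hypothetical)
    SIGMA = 0.05        # noise injected at each stage (hypothetical)
    TRIALS = 10_000

    def chain_error_rate(restore: bool) -> float:
        """Fraction of trials whose final read-out is wrong after the chain."""
        errors = 0
        for _ in range(TRIALS):
            value = 1.0                                  # transmit a logical '1'
            for _ in range(STAGES):
                value += random.gauss(0, SIGMA)          # noise at every stage
                if restore:
                    value = 1.0 if value > 0.5 else 0.0  # gate snaps back to a rail
            if value <= 0.5:                             # final threshold read-out
                errors += 1
        return errors / TRIALS

    print("analog chain  (no restoration):", chain_error_rate(restore=False))
    print("digital chain (restored/gate) :", chain_error_rate(restore=True))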

Der_Einzige 6 days ago | parent [-]

Except that good quantum error correction algorithms don't exist and probably never can, even in theory: https://spectrum.ieee.org/the-case-against-quantum-computing

i7l 6 days ago | parent | next [-]

The current best one- and two-qubit gate error rates are well below 0.01% and going down with every generation of chips. See: https://ianreppel.org/quantum.html

There is no theoretical reason QEC can't exist. In fact, it already does. Is it good enough yet for universal fault tolerance? No, but no one claimed it would be at this point. We get a little closer every year.
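
To illustrate what "it already exists" buys you, here is a minimal classical sketch (mine, not from the linked page) of the idea behind the simplest repetition code: encode one bit as three and decode by majority vote. When the per-bit error rate p is small, the decoded error rate drops to roughly 3p^2, so errors are actively suppressed rather than merely tolerated.

    import random

    def logical_error_rate(p: float, trials: int = 200_000) -> float:
        """Monte Carlo estimate of the decoded error rate of a 3-way repetition code."""
        failures = 0
        for _ in range(trials):
            flips = sum(random.random() < p for _ in range(3))  # independent bit flips
            if flips >= 2:                                       # majority vote fails
                failures += 1
        return failures / trials

    for p in (0.2, 0.1, 0.05, 0.01):
        est = logical_error_rate(p)
        print(f"physical p = {p:<5} -> logical ~ {est:.2e}  (3p^2 - 2p^3 = {3*p**2 - 2*p**3:.2e})")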

In his book, Dyakonov offers no solid argument beyond "it's hard, and therefore probably impossible." That's just an opinion.

analog31 6 days ago | parent | prev [-]

I took a QC course and have done some reading, but I'm hardly an expert. Still, my impression has been: "This is analog computation." To reinforce the similarity, the error level of analog computers can also be reduced by running many of them in parallel and averaging the results.

vrighter 5 days ago | parent [-]

That gets you about one extra bit of precision every time you quadruple the number of parallel machines (or rerun the computation 4x).
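
A quick numerical check of that claim (illustrative values only): averaging N independent noisy runs shrinks the residual noise like 1/sqrt(N), so each quadrupling of N halves the noise, i.e. one more bit.

    import random, statistics

    TRUE_VALUE = 1.0
    SIGMA = 0.1          # per-machine noise level (hypothetical)

    def residual_noise(n_machines: int, trials: int = 5_000) -> float:
        """Std. dev. of the error left after averaging n_machines noisy runs."""
        errors = []
        for _ in range(trials):
            avg = sum(TRUE_VALUE + random.gauss(0, SIGMA)
                      for _ in range(n_machines)) / n_machines
            errors.append(avg - TRUE_VALUE)
        return statistics.pstdev(errors)

    for n in (1, 4, 16, 64):
        print(f"N = {n:3d} machines -> residual noise ~ {residual_noise(n):.4f}")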

analog31 5 days ago | parent [-]

Yep. O(sqrt(n)) is a tough slog.