leptons 6 days ago

I've never seen so much money spent on a fundamentally flawed tech, since maybe Theranos. I'm really starting to doubt the viability of the current crop of quantum computing attempts. I think there probably is some way to harness quantum effects, but I'm not sure computing with an inherently high margin of error is the right way to do it.

rockemsockem 6 days ago | parent | next [-]

I feel like these are extremely different things being compared.

For a lot of technology, most really, the best way to study how to improve it is to build the best thing you know how to, then work on making it better. That's what's been done with all the current quantum computing attempts. Pretty much all of the industry labs with general-purpose quantum computers can in fact run programs on them; they just haven't reached the point where they're running programs that are useful beyond proving out and testing the system.

sesm 6 days ago | parent | prev | next [-]

I'm optimistic about current quantum computers, because they are a tool to study wave function collapse. I hope that they will help us understand the relation between the number of particles and how long a system can stay in an entangled state, which would point to a physical interpretation of quantum mechanics (different from the "we don't talk about wave function collapse" Copenhagen interpretation).

nickpsecurity 6 days ago | parent [-]

The non-experts here might be interested in why you’d want to do that. Do you have explanations or links about it?

tsimionescu 5 days ago | parent [-]

In short, quantum mechanics has a major issue at its core: quantum states evolve by purely deterministic, fully time-reversible evolution of the wave function. But once a classical apparatus measures a quantum system, the wave function collapses to a single point corresponding to the measurement result. This collapse is non-deterministic and not time-reversible.

It is also completely undefined in the theory: the theory doesn't say anything at all about which interactions constitute "quantum interactions" that keep you in the deterministic time-evolution regime, and which interactions constitute "a measurement" and collapse the wave function.

So, this is a major gap in the core of quantum mechanics. Quantum computers are all about keeping the qubits in the deterministic evolution regime while running the program, and performing a measurement only at the end to get a classical result out of it (and then repeating the run a bunch of times, because the result is statistical). So, the hope is that they might shed some light on how to precisely separate quantum interactions from measurements.
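The distinction being described can be seen in a toy single-qubit simulator (plain Python, no real quantum SDK; the function names here are just illustrative): unitary evolution is deterministic and reversible, while measurement is a probabilistic collapse you have to repeat many times to estimate probabilities.

```python
import math
import random

def apply_hadamard(state):
    """Deterministic, reversible evolution: the Hadamard gate is its
    own inverse, so applying it twice recovers the original state."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Born rule: collapse to 0 or 1 with probability |amplitude|^2.
    Non-deterministic and not reversible."""
    a, _ = state
    return 0 if random.random() < a * a else 1

random.seed(0)
zero = (1.0, 0.0)

# Reversibility of the unitary regime: H(H|0>) == |0>.
back = apply_hadamard(apply_hadamard(zero))
assert abs(back[0] - 1.0) < 1e-12 and abs(back[1]) < 1e-12

# Measurement is statistical: repeat the run many times, as real
# quantum computations do, to estimate the outcome distribution.
plus = apply_hadamard(zero)
ones = sum(measure(plus) for _ in range(10_000))
print(ones / 10_000)  # close to 0.5
```

The theory itself doesn't say where the line between `apply_hadamard`-style evolution and `measure`-style collapse falls; the simulator just makes the two regimes explicit by construction.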

nickpsecurity 5 days ago | parent [-]

Wow, that is a huge gap. Thanks for the explanation.

zeroxfe 5 days ago | parent | prev | next [-]

> fundamentally flawed tech, since maybe Theranos

That's a pretty dramatic claim. We've had to (and still have to) deal with the same class of problems when going from analog to digital in chips, communications, optics, etc. The primitives that reality gives us to work with are not discrete.

benreesman 6 days ago | parent | prev [-]

I think quantum computing research makes a lot more sense through the lens of “real scientists had to do something for funding while string theory was going on”.

Quantum computing may or may not get industrial results in the next N years, but those folks do theory, and they often, if not usually, (in)validate it by experiment: it's science.