| ▲ | xscott 10 months ago |
| While I'm still eager to see where Quantum Computing leads, I've got a new threshold for "breakthrough": Until a quantum computer can factor products of primes larger than a few bits, I'll consider it a work in progress at best. |
|
| ▲ | Strilanc 10 months ago | parent | next [-] |
| If qubit count increased by 2x per year, largest-number-factored would show no progress for ~8 years. Then the largest number factored would double in size each year, with RSA2048 broken after a total of ~15 years. The initial lull is because the cost of error correction is so front-loaded. Depending on your interests, the initial insensitivity of largest-number-factored as a metric is either great (it reduces distractions) or terrible (it fails to accurately report progress). For example, if the actual improvement rate were 10x per year instead of 2x, it'd take ~3 years before you realized RSA2048 was going to break in 2 more years rather than 12. |
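To make the shape of that curve concrete, here is a toy back-of-envelope model of the argument above. Every constant in it (starting qubit count, error-correction overhead, logical qubits per bit of the number being factored) is an illustrative assumption, not a real hardware figure:

```python
# Toy model: physical qubits double yearly; error correction converts them to
# logical qubits at a heavy fixed ratio; Shor-style factoring of an n-bit
# number needs a few times n logical qubits. All constants are assumptions.
PHYSICAL_START = 100          # assumed physical qubits today
GROWTH_PER_YEAR = 2           # the hypothesized doubling rate
PHYSICAL_PER_LOGICAL = 1000   # assumed error-correction overhead
LOGICAL_PER_BIT = 3           # assumed Shor cost: ~3n logical qubits

for year in range(17):
    physical = PHYSICAL_START * GROWTH_PER_YEAR ** year
    n_bits = (physical // PHYSICAL_PER_LOGICAL) // LOGICAL_PER_BIT
    print(f"year {year:2d}: {physical:>12,} physical -> ~{n_bits}-bit numbers")
```

Under these made-up constants the printout sits at zero bits for the first several years and then doubles annually, which is exactly the lull-then-takeoff pattern described above.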
| |
| ▲ | xscott 10 months ago | parent [-] | | What's the rough bit count of the largest numbers anyone's quantum computer can factor today? Breaking RSA2048 would be a huge breakthrough, but I'm wondering if they can even factor `221 = 13*17` yet (RSA8). And as I've mentioned elsewhere, the other QC problems I've seen sure seem like simulating a noisy circuit with a noisy circuit. But I know I don't know enough to say that with confidence. | | |
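For scale, numbers of that size are trivially within classical reach; a few lines of trial division factor them instantly (the `factor` helper below is just illustrative):

```python
# Classical trial division: semiprimes the size of 221 (the "RSA8" of the
# comment above) factor in microseconds on any laptop, which is the bar
# quantum factoring demos are implicitly measured against.
def factor(n: int) -> tuple[int, int]:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

print(factor(221))  # -> (13, 17)
```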
| ▲ | Strilanc 10 months ago | parent [-] | | Like I said above, the size of number that can be factored will sit still for years while error correction spins up. It'll be a good metric for progress later; it's a terrible metric for progress now. Too coarse. | | |
| ▲ | xscott 10 months ago | parent [-] | | Heh, that seems evasive. Good metric or not, it makes me think they aren't at the point where they can factor `15 = 3*5`. I'm not trying to disparage quantum computing. I think the topic is fascinating. At one point I even considered going back to school for a physics degree so I would have the background to understand it. | | |
| ▲ | Strilanc 10 months ago | parent [-] | | I'm not trying to be evasive. I'm directly saying quantum computers won't factor interesting numbers for years. That's more typically described as biting the bullet. There are several experiments that claim to factor 15 with a quantum computer (e.g. [1][2]). But beware these experiments cheat to various degrees (e.g. instead of performing period finding against multiplication mod 15 they do some simpler process known to have the same period). Even without cheating, 15 is a huge outlier in the simplicity of the modular arithmetic. For example, I think 15 is the only odd semiprime where you can implement modular multiplication by a constant using nothing but bit flips and bit swaps. Being so close to a power of 2 also doesn't hurt. Beware there's a constant annoying trickle of claims of factoring numbers larger than 15 with quantum computers, but using completely irrelevant methods where there's no reason to expect the costs to scale subexponentially. For example, Zapata (the quantum startup that recently went bankrupt) had one of those [3]. [1]: https://www.nature.com/articles/414883a [2]: https://arxiv.org/abs/1202.5707 [3]: https://scottaaronson.blog/?p=4447 | | |
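That claim about 15 is easy to check by brute force. The sketch below (my own illustration, not taken from the cited papers) searches, for each multiplier a coprime to 15, for a permutation of the 4 bit positions plus a flip mask that reproduces x -> a*x mod 15 on the multiplicative units, the only values Shor's period finding ever routes through the multiplier:

```python
# Verify: for N = 15, multiplication by any constant coprime to N acts on a
# 4-bit register as a bit permutation plus optional bit flips (checked on the
# units mod 15, the only inputs the period-finding circuit ever supplies).
from itertools import permutations
from math import gcd

N, BITS = 15, 4
units = [x for x in range(1, N) if gcd(x, N) == 1]

def apply(perm, mask, x):
    # Route input bit perm[i] to output bit i, then XOR with a flip mask.
    y = 0
    for i, j in enumerate(perm):
        y |= ((x >> j) & 1) << i
    return y ^ mask

for a in units:
    sol = next(
        (perm, mask)
        for perm in permutations(range(BITS))
        for mask in range(1 << BITS)
        if all(apply(perm, mask, x) == a * x % N for x in units)
    )
    print(f"x -> {a}*x mod 15: bit permutation {sol[0]}, flip mask {sol[1]:04b}")
```

For example, multiplication by 2 mod 15 is a pure cyclic bit rotation (since 2^4 = 16 ≡ 1 mod 15), and multiplication by 7 ≡ -8 is a rotation followed by flipping all four bits.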
|
|
|
|
|
| ▲ | UberFly 10 months ago | parent | prev | next [-] |
| I guess like most of these kinds of projects, it'll be smaller, less flashy breakthroughs or milestones along the way. |
| |
| ▲ | Terr_ 10 months ago | parent [-] | | People dramatically underestimate how important incremental unsung progress is, perhaps because it just doesn't make for a nice memorable story compared to Suddenly Great Person Has Amazing Idea Nobody Had Before. |
|
|
| ▲ | mrandish 10 months ago | parent | prev | next [-] |
| > While I'm still eager to see where Quantum Computing leads Agreed. Although I'm no expert in this domain, I've been watching it a long time as a hopeful fan. Recently I've been increasing my (currently small) estimated probability that quantum computing may not ever (or at least not in my lifetime) become a commercially viable replacement for SOTA classical computing to solve valuable real-world problems. I wish I knew enough to have a detailed argument but I don't. It's more of a concern triggered by reading media reports that seem to just assume "sure it's hard, but there's no doubt we'll get there eventually." While I agree quantum algorithms can solve valuable real-world problems in theory, it's pretty clear there are still a lot of unknown unknowns in getting all the way to "commercially viable replacement solving valuable real-world problems." It seems at least possible we may still discover some fundamental limit(s) preventing us from engineering a solution that's reliable enough and cost-effective enough to reach commercial viability at scale. I'd actually be interested in hearing counter-arguments that we now know enough to be reasonably confident it's mostly just "really hard engineering" left to solve. |
|
| ▲ | ashleyn 10 months ago | parent | prev | next [-] |
| My first question any time I see another quantum computing breakthrough: is my cryptography still safe? Answer seems like yes for now. |
| |
| ▲ | catqubit 10 months ago | parent | next [-] | | Depends on your personal use. In general, the answer to the ‘Is crypto still safe’ question is typically no - not because we have a quantum computer waiting in the wings ready to break RSA right now, but because of a) the longevity of the data we might need to secure and b) the transition time to migrate to new crypto schemes. While the NIST post-quantum crypto standards have been announced, there is still a long way to go for them to be reliably implemented across enterprises. Shor’s algorithm isn’t really going to be a real-time decryption algorithm; the threat is more of a ‘harvest now, decrypt later’ approach. | |
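This longevity-plus-migration argument is often summarized as Mosca's inequality: if x (how long your data must stay secret) plus y (how long migration takes) exceeds z (time until a cryptographically relevant quantum computer exists), you are already exposed. A trivial sketch, with illustrative made-up numbers:

```python
# Mosca's rule of thumb: risk exists today if x + y > z.
# All three values below are illustrative assumptions, not forecasts.
x_secrecy_years = 15    # how long harvested data must remain confidential
y_migration_years = 8   # time to roll out post-quantum crypto everywhere
z_years_to_crqc = 20    # time until an RSA-breaking quantum computer

exposed = x_secrecy_years + y_migration_years > z_years_to_crqc
print(f"x + y = {x_secrecy_years + y_migration_years}, z = {z_years_to_crqc}: "
      + ("'harvest now, decrypt later' is a live threat" if exposed
         else "within budget"))
```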
| ▲ | xscott 10 months ago | parent | prev [-] | | I have a pseudo-theory that the universe will never allow quantum physics to provide an answer to a problem where you didn't already know the result from some deterministic means. This will be some bizarre consequence of information theory colliding with the measurement problem. :-) | | |
| ▲ | rocqua 10 months ago | parent [-] | | You can use a quantum computer to ask about the behavior of a random quantum circuit. Google actually did this a while ago, and the result was beyond what a classical computer could simulate in a reasonable time. |
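"Beyond what a classical computer could simulate" has a concrete cost behind it: a brute-force statevector simulation of n qubits stores 2^n complex amplitudes. The sketch below just computes that memory footprint; real simulations of Google's experiment used cleverer tensor-network methods, but they still face an exponential wall:

```python
# Memory needed to hold a full statevector of n qubits: 2**n amplitudes at
# 16 bytes each (complex128). 53 is the qubit count of Google's 2019 device.
BYTES_PER_AMPLITUDE = 16

for n in (30, 40, 53, 70):
    gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB of amplitudes")
```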
|
|
|
| ▲ | kridsdale1 10 months ago | parent | prev | next [-] |
| There will be a thousand breakthroughs before that point. |
| |
| ▲ | xscott 10 months ago | parent [-] | | That just means that the word "breakthrough" has lost its meaning. I would suggest the word "advancement", but I know this is a losing battle. | |
| ▲ | Suppafly 10 months ago | parent [-] | | >That just means that the word "breakthrough" has lost its meaning. This. Small, incremental and predictable advances aren't breakthroughs. |
|
|
|
| ▲ | dekhn 10 months ago | parent | prev [-] |
| Quantum computers can (should be able to; do not currently) solve many useful problems without ever being able to factor products of primes. |
| |
| ▲ | xscott 10 months ago | parent | next [-] | | What are some good examples? The one a few years ago where Google declared "quantum supremacy" sounded a lot like simulating a noisy circuit by implementing a noisy circuit. And that seems a lot like simulating the falling particles and their collisions in an hourglass by using a physical hourglass. | |
| ▲ | dekhn 10 months ago | parent | next [-] | | The only one I can think of is simulating physical systems, especially quantum ones. Google's supremacy claim didn't impress me; besides being a computationally uninteresting problem, it really just motivated the supercomputer people to improve their algorithms. To really establish this field as a viable going concern, somebody probably needs to do "something" with quantum that is experimentally verifiable but not computable classically, and is a useful computation. | |
| ▲ | SAI_Peregrinus 10 months ago | parent [-] | | That is equivalent to proving BQP ≠ P. We currently don't know that any problem even exists that can be solved efficiently (in polynomial time) by quantum computers but not by classical computers. |
| |
| ▲ | EvgeniyZh 10 months ago | parent | prev [-] | | I wrote a long-ish comment about what you can expect of QC just yesterday https://news.ycombinator.com/item?id=42212878 | | |
| ▲ | xscott 10 months ago | parent [-] | | Thank you for the link. I appreciate the write-up. This sentence though: > breaking some cryptography schemes is not exactly the most exciting thing IMHO You're probably right that we'll migrate to QC-resistant algorithms before this happens, but if factoring were solved today, I think it would be very exciting :-) | |
| ▲ | EvgeniyZh 10 months ago | parent [-] | | I think it would be very __impactful__, but it is not really useful for humanity, rather the opposite. | |
| ▲ | xscott 10 months ago | parent [-] | | Who knows. "It's difficult to make predictions, especially about the future", but it might be a good thing if it accelerated the switch to new crypto algorithms, leaving fewer secrets to be dug up later. |
|
|
|
| |
| ▲ | Eji1700 10 months ago | parent | prev [-] | | Yeah, I think that's the issue that makes it hard to assess quantum computing. My very layman's understanding is that there are certain things it will be several orders of magnitude better at, but at "simple" things that a normal machine handles easily, quantum will be just as bad if not massively worse. It really should be treated as a different tool for right now. Maybe some day in the very far future, if it becomes easier to make quantum computers, an abstraction layer will be arrived at that makes the end user think it's just like a normal computer, but from a "looking at a series of 1/0's" versus "looking at a series of superimposed particles" standpoint, they are extremely different in function. |
|