| ▲ | alchemist1e9 6 days ago |
| And these are the same quantum computers that will eventually break elliptic curve cryptography? Now I’m very confused. |
|
| ▲ | griffzhowl 6 days ago | parent | next [-] |
| If we can build a machine with enough coherent qubits, then it'll be able to break ECC. As it turns out, that's a big if, but the bigness of the if is about hardware implementation. The theory behind it is just basic quantum mechanics |
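The theory is indeed simple: Shor's algorithm reduces factoring N to finding the multiplicative order of a random a mod N, and the quantum hardware is only needed to make that order-finding step fast. A minimal classical sketch of the reduction (the order is found by brute force here, which is exactly the part a quantum computer would speed up; `shor_classical` is an illustrative name, not a real library function):

```python
# Sketch: the classical skeleton of Shor's algorithm.
# The order-finding loop below is the only step that needs a quantum computer.
from math import gcd

def shor_classical(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a already shares a factor with N
    # Find the order r of a mod N (brute force stands in for the quantum step).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2:
        return None               # odd order: retry with a different a
    f = gcd(pow(a, r // 2, N) - 1, N)
    return f if 1 < f < N else None

print(shor_classical(15, 7))  # order of 7 mod 15 is 4; gcd(7^2 - 1, 15) = 3
```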
| |
|
| ▲ | wongarsu 6 days ago | parent | prev | next [-] |
| In many applications you are concerned about data you encrypt today still being secure 20 years from now, especially if your threat model includes a malicious actor holding on to data in hopes of decrypting it later. From the article it sounds like we will still be safe for 20+ years. On the other hand, 15 was just extraordinarily easy; progress after 21 will be much quicker. And we never know which breakthroughs might come in the next decades that speed up progress. |
| |
| ▲ | oh_my_goodness 6 days ago | parent [-] | | "progress after 21 will be much quicker" Can you provide a quick verification for that? | | |
▲ | wongarsu 6 days ago | parent [-] | | The article lists three reasons why 21 is so much harder than 15. Mostly that with 15 most of the required conditional multiplications are multiplications by 1, and the remaining multiplication can be done with cheap shifts because it's a multiplication by a power of 2 modulo 15 (which is one less than a power of two). But 22 and 24 are in the same boat as 21 here. All three of them require conditional multiplications by factors other than 1, and none of them is one less than a power of 2. You need slightly more multiplications (and thus more gates) as the numbers get larger, but that only grows linearly. Maybe the conditional multiplications required get slightly more expensive to implement, but I wouldn't expect a 100x cost blowup from that. Error correction is still an issue, potentially making a linear complexity increase quadratic, but qubit counts in quantum computers also increase at an exponential rate | | |
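The shift trick for N=15 can be checked classically: because 15 = 2^4 - 1, multiplying by 2^k mod 15 is just a cyclic rotation of a 4-bit value, which is why those conditional multiplications cost almost nothing in the circuit. A quick sketch (`rotl4` is an illustrative helper, not quantum code):

```python
# Sketch: multiplication by 2^k modulo 2^n - 1 is an n-bit cyclic rotation.
# For N = 15 (n = 4), Shor's conditional multiplications become qubit swaps.

def rotl4(x, k):
    """Rotate the 4-bit value x left by k positions."""
    k %= 4
    return ((x << k) | (x >> (4 - k))) & 0b1111

# Verify the identity for every 4-bit value and every shift amount.
for x in range(15):
    for k in range(4):
        assert rotl4(x, k) == (x * 2**k) % 15
print("multiplication by 2^k mod 15 is just a 4-bit rotation")
```

Nothing analogous is available for 21, 22, or 24, since none of them is one less than a power of two.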
| ▲ | oh_my_goodness 6 days ago | parent | next [-] | | 24 has 3 as a factor. 3 is one less than a power of 2. | | |
▲ | freehorse 6 days ago | parent | next [-] | | Well, n=21 has 3 as a factor too, but the solution for n=15 used that 15 itself is one less than a power of 2, not its divisors, because we are working modulo n. | | | |
| ▲ | Dylan16807 6 days ago | parent | prev [-] | | That's a bit off, but more importantly using information about the answer to make the circuit cheaper is cheating. |
| |
| ▲ | oh_my_goodness 6 days ago | parent | prev [-] | | 23 and 29 are prime. | | |
| ▲ | wongarsu 6 days ago | parent [-] | | That's what I get for not enough coffee. Same holds for 22 and 24 though |
|
|
|
|
|
| ▲ | bArray 6 days ago | parent | prev | next [-] |
| I think the idea is that it doesn’t matter if it takes billions of gates to achieve; the point is that it can do it very fast. If we thought a table-sized FPGA could do it, a state actor would most definitely build one. |
| |
▲ | lazide 6 days ago | parent [-] | | Theoretically. The practical problem is that ‘noise’ between gates seems to increase exponentially, so in practice it may actually be impossible to construct anything with more than a handful of gates for the foreseeable (possibly indefinite?) future. It’s essentially the crypto version of fusion. | | |
| ▲ | EthanHeilman 6 days ago | parent [-] | | Quantum error correction addresses this problem and we now have tech demos showing that we can build scalable QCs with surface codes [0]. [0] https://scottaaronson.blog/?p=8525 | | |
| ▲ | Der_Einzige 6 days ago | parent | next [-] | | This, like all other purported advancements in quantum error correction, is a meme with zero practical impact. | | | |
| ▲ | lazide 6 days ago | parent | prev [-] | | Cool, so we should totally be able to factor 21 (or larger numbers)…. When? | | |
▲ | EthanHeilman 6 days ago | parent [-] | | You brought up gate noise; there has been quite a bit of progress on that problem. > so we should totally be able to factor 21 (or larger numbers)…. When? Just because we solve one problem doesn't imply all the problems in QC are also instantly solved. I guess it does if you assume noise is the only problem and once it is solved the engineering is trivial. That is not the case. Even assuming all foundational problems have been solved, figuring out how to actually engineer, and mass produce, large numbers of gates will take a while. As the article pointed out, going from 15 to 21 requires a 100x increase in gates. As the article that you posted under says: "Because of the large cost of quantum factoring numbers (that aren’t 15), factoring isn’t yet a good benchmark for tracking the progress of quantum computers. If you want to stay abreast of progress in quantum computing, you should be paying attention to the arrival of quantum error correction (such as surface codes getting more reliable as their size is increased) and to architectures solving core scaling challenges (such as lost neutral atoms being continuously replaced)." |
|
|
|
|
|
| ▲ | Mistletoe 6 days ago | parent | prev | next [-] |
| This is the reality a million clickbait fearmongering articles won’t tell you. And pr machines for quantum computing companies won’t either. |
|
| ▲ | privatelypublic 6 days ago | parent | prev | next [-] |
| Hasn't classical computing already severely crippled ECC because of some mathematical assumptions that somebody proved wrong back in 2022? |
| |
▲ | cwmma 6 days ago | parent | next [-] | | I believe you are thinking of "Supersingular Isogeny Key Encapsulation" or SIKE, which is a post-quantum encryption algorithm that was spectacularly broken a couple years ago. The math involves elliptic curves, but it's different from the elliptic curve cryptography used in your browser. | |
| ▲ | kevindamm 6 days ago | parent | prev | next [-] | | Which assumptions? ECDLP is still considered computationally hard, and ECC considered secure. There are invalid curve attacks and small subgroup attacks but that's a problem with key selection, not a fundamental problem with ECC. Do you have a citation? | |
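An invalid-curve attack works by handing a peer a point that isn't actually on the agreed curve; the standard mitigation is to validate incoming points before using them in key agreement, which is exactly why these are key-handling problems rather than breaks of ECC itself. A toy sketch (the parameters `p`, `a`, `b` below are made up for illustration, not a real curve):

```python
# Sketch: the core check that defeats invalid-curve attacks is verifying that
# a received point satisfies the curve equation y^2 = x^3 + ax + b (mod p).

# Toy short-Weierstrass curve over GF(97) -- illustrative parameters only.
p, a, b = 97, 2, 3

def on_curve(x, y):
    """Return True iff (x, y) satisfies the curve equation mod p."""
    return (y * y - (x ** 3 + a * x + b)) % p == 0

print(on_curve(3, 6))   # 36 == 27 + 6 + 3, so True: accept the point
print(on_curve(3, 7))   # False: reject before doing any ECDH with it
```

Real implementations also check that the point is not the identity and lies in the prime-order subgroup (the small-subgroup part of the same defence).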
| ▲ | markusde 6 days ago | parent | prev | next [-] | | Could you link to any more information about this? | |
▲ | bitexploder 6 days ago | parent | prev [-] | | Not in general, no. It is still secure and widely used. There are of course attacks, but none of them completely broke it |
|
|
| ▲ | z3phyr 6 days ago | parent | prev | next [-] |
| "Is this what you can conjure Saruman?" I have a strong belief that new mathematical tools and methods can be developed that can make it "easy" to break a lot of modern cryptography primitives without ever using a quantum computer. |
|
| ▲ | santiagobasulto 6 days ago | parent | prev [-] |
| The potential is there; we just haven't made it yet. It's the same with AI, AGI and all that. Imagine if you'd read a response from GPT-2 back in 2019: you'd also be like "and these are the same models that will eventually give us AGI". |
| |
| ▲ | heyjamesknight 6 days ago | parent | next [-] | | Not a great analogy, since there’s zero chance the kinds of model involved in GPT-2 will give us AGI. | | |
▲ | ACCount37 6 days ago | parent [-] | | Zero? Aren't you a little bit overconfident on that? Transformer LLMs already gave us the most general AI yet, by far - and they keep getting developed further, with a number of recent breakthroughs and milestones. | | |
▲ | heyjamesknight 4 days ago | parent | next [-] | | No. The fundamental encoding unit of an LLM is semantic. Mapping reality into semantic space is a form of lossy compression. There are entire categories of experience that can't be properly modeled in semantic space. Even in "multimodal" models, text is still the fundamental unit of data storage and transformation between the modes. That's not the case for how your brain works—you don't see a pigeon, label it as "pigeon," and then refer to your knowledge about "pigeons". You just experience the pigeon. We have 100K years of Homo sapiens thriving without language. "General intelligence" occurs at a level above semantics. | |
| ▲ | bigfickpm 6 days ago | parent | prev [-] | | [dead] |
|
| |
| ▲ | fmbb 6 days ago | parent | prev [-] | | Imagine if you'd read a response from GPT-5 in 2025, you'd also be like "and these are the same models that will eventually give us AGI". |
|