fxwin 5 days ago

The page only talks about adopting PQC for key agreement for SSH connections, not encryption in general, so the overhead would be rather minimal here. Also from the FAQ:

"Quantum computers don't exist yet, why go to all this trouble?"

Because of the "store now, decrypt later" attack mentioned above. Traffic sent today is at risk of decryption unless post-quantum key agreement is used.

"I don't believe we'll ever get quantum computers. This is a waste of time"

Some people consider the task of scaling existing quantum computers up to the point where they can tackle cryptographic problems to be practically insurmountable. This is a possibility. However, it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics. If we're right about quantum computers being practical, then we will have protected vast quantities of user data. If we're wrong about it, then all we'll have done is moved to cryptographic algorithms with stronger mathematical underpinnings.

Not sure I'd take the cited paper (while fun to read) too seriously as something to inform my opinion on the risks of using quantum-insecure encryption, rather than as a cynical take on hype and window dressing in QC research.
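
For SSH specifically, opting in is mostly a configuration and version question. A minimal client-side sketch, assuming an OpenSSH build recent enough to ship hybrid post-quantum key exchange (the exact algorithm names depend on your version; sntrup761x25519-sha256@openssh.com has shipped for a while, mlkem768x25519-sha256 is newer, and `ssh -Q kex` lists what your build supports):

    # ~/.ssh/config -- prefer hybrid (post-quantum + classical) key agreement where available
    Host *
        KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha256@openssh.com,curve25519-sha256

Either hybrid method keeps a classical component, so you lose nothing even if the post-quantum half turns out to be weaker than hoped.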

sigmoid10 5 days ago | parent | next [-]

>it appears that most of the barriers to a cryptographically-relevant quantum computer are engineering challenges rather than underlying physics

I heard this 15 years ago when I started university. People claimed all the basics were done, that we "only" needed to scale. That we would see practical quantum computers in 5-10 years. Today I still see the same estimates. Maybe 5 years by extreme optimists, 10-20 years by more reserved people. It's the same story as nuclear fusion. But who's prepping for unlimited energy today? Even though it would make sense to build future industrial environments around that if they want to be competitive.

fxwin 5 days ago | parent | next [-]

> People claimed all the basics were done, that we "only" needed to scale.

This claim is fundamentally different from what you quoted.

> But who's prepping for unlimited energy today?

It's about tradeoffs: it costs almost nothing to switch to PQC methods, but I can't see a way to "prep for unlimited energy" that doesn't come with a huge cost/waste of time in the case where that doesn't happen

thayne 5 days ago | parent | next [-]

> It's about tradeoffs: it costs almost nothing to switch to PQC methods,

It costs:

- development time to switch things over

- more computation, and thus more energy, because PQC algorithms aren't as efficient as classical ones

- more bandwidth, because PQC algorithms require larger keys

throw0101a 5 days ago | parent | next [-]

> It costs:

Not wrong, but given these algorithms are mostly used at setup, how much cost is actually being incurred compared to the entire session? Certainly if your sessions are short-lived then the 'overhead' of PQC/hybrid is higher, but I'd be curious to know the actual byte and energy costs over and above non-PQC/hybrid, i.e., how many bytes/joules for a non-PQC exchange and how many more by adding PQC. E.g.

> Unfortunately, many of the proposed post-quantum cryptographic primitives have significant drawbacks compared to existing mechanisms, in particular producing outputs that are much larger. For signatures, a state of the art classical signature scheme is Ed25519, which produces 64-byte signatures and 32-byte public keys, while for widely-used RSA-2048 the values are around 256 bytes for both. Compare this to the lowest security strength ML-DSA post-quantum signature scheme, which has signatures of 2,420 bytes (i.e., over 2kB!) and public keys that are also over a kB in size (1,312 bytes). For encryption, the equivalent would be comparing X25519 as a KEM (32-byte public keys and ciphertexts) with ML-KEM-512 (800-byte PK, 768-byte ciphertext).

* https://neilmadden.blog/2025/06/20/are-we-overthinking-post-...

"The impact of data-heavy, post-quantum TLS 1.3 on the Time-To-Last-Byte of real-world connections" (PDF):

* https://csrc.nist.gov/csrc/media/Events/2024/fifth-pqc-stand...

(And development time is also generally one-time.)
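
Taking the sizes quoted above at face value (ML-KEM-512 vs. X25519), a back-of-envelope sketch of the "how many more bytes" part; real handshakes add framing, certificates, etc., and OpenSSH actually uses the somewhat larger ML-KEM-768, so treat these as rough lower bounds:

    # back-of-envelope using the byte counts quoted above
    x25519_pk = x25519_ct = 32            # bytes each way for the classical exchange
    mlkem512_pk, mlkem512_ct = 800, 768   # from the quote above

    classical = x25519_pk + x25519_ct                  # 64 bytes
    hybrid = classical + mlkem512_pk + mlkem512_ct     # hybrid schemes send both

    extra = hybrid - classical                         # 1568 extra bytes per key agreement
    print(f"extra bytes per hybrid key agreement: {extra}")
    print(f"extra traffic per million handshakes: {extra * 1e6 / 1e9:.2f} GB")

Roughly 1.5 kB per connection, or on the order of 1.5 GB per million connections: real, but small next to what most sessions transfer.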

thayne 5 days ago | parent [-]

For an individual session, the cost is certainly small. But in aggregate it adds up.

I don't think the cost is large, and I agree that given the tradeoff, the cost is probably worth it, but there is a cost, and I'm not sure it can be categorized as "almost nothing".

djmdjm 5 days ago | parent | prev | next [-]

> - development time to switch things over

This is a one time cost, and generally the implementations we're switching to are better quality than the classical algorithms they replace. For instance, the implementation of ML-KEM we use in OpenSSH comes from Cryspen's libcrux[1], which is formally-verified and quite fast.

[1] https://github.com/cryspen/libcrux

> - more computation, and thus more energy, because PQC algorithms aren't as efficient as classical ones

ML-KEM is very fast. In OpenSSH it's much faster than classic DH at the same security level and only slightly slower than ECDH/X25519.

> - more bandwidth, because PQC algorithms require larger keys

For key agreement, it's barely noticeable. ML-KEM public keys are slightly over 1 KB. Again this is larger than ECDH but comparable to classic DH.

PQ signatures are larger, e.g. an ML-DSA signature is about 3 KB, but again this only happens once or twice per SSH connection and is totally lost in the noise.

fxwin 5 days ago | parent | prev [-]

all of which are costs that pale in comparison to having your data compromised, depending on what it is

bee_rider 5 days ago | parent | prev [-]

Anyway, what does prepping for unlimited energy look like? I guess, favoring electrical over fossil fuels. But for normal people and the vast majority of companies, that looks like preparing for mass renewable electricity anyway, which is already a good thing to do.

thesz 5 days ago | parent | next [-]

With limitless energy you can have "fossil fuel" synthesized from air and water [1] and use existing "fossil fuel" infrastructure.

[1] https://www.wired.com/2012/10/fuel-from-air/

fxwin 5 days ago | parent | prev [-]

could also just mean massively scaling up energy consumption with little concern for efficiency (since limitless energy would imply very low cost), which would probably be a bad idea for renewables, and very expensive if the energy turns out not to be so cheap

unethical_ban 5 days ago | parent | prev | next [-]

The comparison to fusion power doesn't hold.

First, the costs to migrate to PQC continue to drop as the algorithms become mainstream. Second, the threat exists /now/ of organizations capturing encrypted data to decrypt later. There is no comparable current threat from "not preparing for fusion", whatever that would entail.

dlubarov 5 days ago | parent | prev | next [-]

I would just take this to mean that most people are bad at estimating timelines for complex engineering tasks. 15 years isn't a ton of time, and the progress that has been made was done with pretty limited resources (compared to, say, traditional microprocessors).

spauldo 4 days ago | parent | prev [-]

Why would you think that fusion would give you unlimited energy? All it does is allow you to get energy from cheap, nearly unlimited fuel. You still have to produce, transmit, store, and distribute that energy.

It's great for the environment but for most people not much would change.

pclmulqdq 5 days ago | parent | prev | next [-]

It's been "engineering challenges" for 30 years. At some point, "engineering challenges" stops being a good excuse, and that point was about 20 years ago.

At some point, someone may discover some new physics that shows that all of these "engineering challenges" were actually a physics problem, but quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

fxwin 5 days ago | parent | next [-]

You might be right that we'll never have quantum computers capable of cracking conventional cryptographic methods, but I'd rather err on the side of caution in this regard considering how easy it is to switch, and how disastrous it could be otherwise.

simiones 5 days ago | parent | next [-]

As others pointed out, it's not so easy to switch, as the PQC versions require much more data to be sent to establish a connection, and consequently way more CPU time. So the CPS (connections per second) you can achieve with this type of cryptography will be MUCH worse than with classical algorithms.

ifwinterco 5 days ago | parent | next [-]

Let's be honest though, key exchange is not exactly the limiting factor for web performance in 2025

msgodel 5 days ago | parent [-]

It can be limiting for other things though. Encrypted DNS was already marginal for some TLD operators, adding the overhead of PQC may actually make it completely impractical.

fxwin 5 days ago | parent | prev [-]

it doesn't get much easier than that, and the downsides are much much much less of an inconvenience than having your data breached depending on what it is.

bbarnett 5 days ago | parent | prev | next [-]

Especially if the breakthrough isn't public and is used behind the scenes.

westurner 5 days ago | parent | prev [-]

"A First Successful Factorization of RSA-2048 Integer by D-Wave Quantum Computer" (2025-06) https://ieeexplore.ieee.org/document/10817698

pclmulqdq 5 days ago | parent | next [-]

Yeah, except when your "2048-bit" numbers are guaranteed to have factors that differ by exactly two bits, you can factor them with any computer you want.

The D-wave also isn't capable of Shor's algorithm or any other quantum-accelerated version of this problem.
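
To spell out why the "factors that differ by a couple of bits" caveat guts the result: when p and q are that close together, Fermat's method factors the modulus instantly on a laptop. A sketch, assuming the factors really are nearly equal as described above; the nextprime construction is just my way of manufacturing two adjacent primes for illustration, not how the paper built its inputs:

    # Fermat factorization: trivial whenever the two prime factors are close.
    from math import isqrt
    from sympy import nextprime   # only used to manufacture two adjacent primes

    p = nextprime(2**1024)        # ~1024-bit prime
    q = nextprime(p)              # the very next prime, so |p - q| is tiny
    n = p * q                     # a "2048-bit" modulus with suspiciously close factors

    a = isqrt(n)
    if a * a < n:
        a += 1
    while True:                   # finishes after a handful of iterations here
        b2 = a * a - n
        b = isqrt(b2)
        if b * b == b2:
            break
        a += 1

    print((a - b) * (a + b) == n)   # True: p and q recovered classically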

westurner 5 days ago | parent | next [-]

Have you or anyone else proven that there is no annealing implementation of Shor's?

Why are you claiming superiority in ignorance?

maratc 5 days ago | parent | prev [-]

I was at a lecture by a professor who works in the field; his main argument was that quantum computers are physically impossible to scale.

He presented us with a picture of him and a number of other very important scientists in this field, none of them sharing his attitude. We then joked that there is a quantum entanglement of Nobel prize winners in the picture.

westurner 5 days ago | parent [-]

I don't think that that professor was correct.

The universe is constantly doing large, scaled quantum computations.

The number of error-corrected qubits per QC will probably increase at an exponential rate.

Whether there is a problem decomposition strategy for RSA could change.

Oh, entanglement and the prize! Adherence to Bell's is abstruse and obtuse. Like attaching to a student of Minkowski's who served as an honorable patent examiner in Europe who moved to America. We might agree that there are many loopholes by which information sharing through entanglement is possible; that Bell's theorem is not a real limit to communications or QC because there are many "loopholes to"

mikestorrent 5 days ago | parent | prev | next [-]

D-Wave themselves do not emphasize this use case and have said many times that they don't expect annealing quantum computers to be used for this kind of decryption attack. Annealers are used for optimization problems where you're trying to find the lowest energy solution to a constraint problem, not Shor's Algorithm.

In that sense, they're more useful for normal folks today, and don't pose as many potential problems.
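
For a concrete sense of what "factoring as an optimization problem" means in the annealing world: you can write the energy as (N - p*q)^2 over the bits of p and q and ask for its minimum. This is a purely classical toy of mine, not D-Wave's actual embedding, and it says nothing about whether the approach scales:

    # Toy: factor N by minimizing the "energy" (N - p*q)^2 over candidate bits,
    # using plain classical simulated annealing.
    import math, random

    def anneal_factor(N, bits=4, steps=100_000, seed=1):
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(2 * bits)]   # bits of p and q

        def decode(x):
            p = 1 + sum(b << (i + 1) for i, b in enumerate(x[:bits]))   # odd, > 0
            q = 1 + sum(b << (i + 1) for i, b in enumerate(x[bits:]))
            return p, q

        def energy(x):
            p, q = decode(x)
            return (N - p * q) ** 2

        e, T = energy(x), float(N * N)
        for _ in range(steps):
            if e == 0:
                return decode(x)
            i = rng.randrange(len(x))
            x[i] ^= 1                                       # propose a single-bit flip
            e_new = energy(x)
            if e_new <= e or rng.random() < math.exp((e - e_new) / T):
                e = e_new                                   # accept the move
            else:
                x[i] ^= 1                                   # reject, undo the flip
            T *= 0.99999                                    # cool slowly
        return None

    print(anneal_factor(143))   # typically finds (11, 13) or (13, 11)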

westurner 4 days ago | parent [-]

I suspect that we simply haven't found an annealing solution for factoring integers yet.

It may be that no solution exists, even given better error correction with that many qubits.

A standard LLM today won't yet answer with "no solution exists"

adgjlsfhk1 5 days ago | parent | prev [-]

By that argument, I can factor a 100000000 bit number on my computer in a second.

asah 4 days ago | parent | prev | next [-]

Some good ideas take a long time.

Nuclear energy got commercialized in 1957. The core technology was discovered nearly 50 years earlier.

Electricity was first discovered in ~1750 but commercialized in the late 1800s.

Faraday's experiments on electromagnetism were in 1830-1855 but commercialization took decades.

(The list goes on ...)

pclmulqdq 4 days ago | parent [-]

Your idea of "core technology" is about the first time a theory was discovered that had a technology as a consequence. That's the only way nuclear energy's "core technology" could be said to have been discovered in 1907. By the same token, quantum computing's "core technology" was discovered in 1926 during Erwin Schrödinger's work formalizing wave equations for quantum systems. During those periods when technology takes a long time, both the underlying physics and the engineering make steady advances. 100 years later, we still have very little idea how or why quantum superposition works.

wasabi991011 4 days ago | parent [-]

> 100 years later, we still have very little idea how or why quantum superposition works.

We understand superposition perfectly well. Maybe you are confusing science with philosophy.

Anyway, I'm starting to lose track of your point. There's definitely been steady advances in quantum technology, both in the underlying physics and in engineering. I'm not sure why you think that stopped.

pclmulqdq 4 days ago | parent [-]

What do you mean when you say "we understand superposition perfectly well"? To be very simplistic about this, are you proposing to know the physics of why entanglement can cause information to seemingly travel instantaneously over a distance when this seems to contradict what we know about the speed of light? Does this trigger no questions in your mind about some physical mechanism we don't understand here?

I understand that we have math that says that superposition does work, but we don't actually understand the physics of it. One of the foibles of modern physics is thinking that knowing the math is enough. Newton knew the math of his 100% internally consistent version of physics, but we know that there were observations that were not explained by his math that we now understand the physical mechanisms for.

I understand that "things that are beyond the math and physics I know" may be philosophy in your mind, but that is not a correct definition of philosophy.

wasabi991011 a day ago | parent [-]

>are you proposing to know the physics of why entanglement can cause information to seemingly travel instantaneously over a distance when this seems to contradict what we know about the speed of light?

I guess, in the sense that we know _it doesn't_. First of all, I'm pretty sure you are confusing superposition with entanglement. Second of all, entanglement doesn't transmit any information; it is purely a type of correlation. This is shown in most introductory quantum information or quantum computing courses. You can also find explanations on the physics stackexchange.

Superposition is just another word for the linearity of quantum systems.
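
A quick numerical sanity check of the "correlation, not communication" point, using nothing beyond numpy linear algebra (a toy of mine, not from any course): whatever Bob does locally to his half of a Bell pair, Alice's reduced state is unchanged, so no information flows to her.

    import numpy as np

    # Bell state (|00> + |11>)/sqrt(2) in the basis |00>, |01>, |10>, |11>
    psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())                     # two-qubit density matrix

    def alice_reduced(rho):
        # reshape to (a, b, a', b') and trace out Bob's indices
        return np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))

    rng = np.random.default_rng(0)
    M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    U, _ = np.linalg.qr(M)                              # a random unitary for Bob to apply
    rho_after = np.kron(np.eye(2), U) @ rho @ np.kron(np.eye(2), U).conj().T

    print(np.round(alice_reduced(rho), 3))              # [[0.5, 0], [0, 0.5]]
    print(np.round(alice_reduced(rho_after), 3))        # identical: Bob can't signal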

Anyway, it's a hard question to figure out the limits between math, physics, and philosophy. A lot of physicists believe physics is about making useful mathematical models of reality, and trying to find better ones. Newton might disagree, but he's also been dead hundreds of years.

Anyway, please don't fall for the Dunning-Kruger effect. You clearly are only slightly familiar with quantum physics and have some serious misconceptions, but you sound very sure of yourself.

ziofill 5 days ago | parent | prev [-]

> quantum physics hasn't really advanced in the last 30 years so it's understandable that the physicists are confused about what's wrong.

I have my doubts about who's the confused one. Quantum physics has advanced tremendously in the past 30 years. Do you realize we now have a scheme to break RSA-2048 with 1M noisy qubits? (See Gidney 2025)

pclmulqdq 4 days ago | parent | next [-]

Somehow, we have all these schemes to factor huge numbers, and yet the current record for an actual implementation of Shor's algorithm and similar algorithms came from factoring the number 15 in 2012. There was a recent paper about "factoring" 31, but that paper involved a number of simplifying steps that assumed the number in use was a Mersenne number. People in this field keep showing "algorithm improvements" or "new devices" that are good enough to write a paper, and yet somehow there's always an implementation problem or a translation problem when someone comes asking about using it.

If this algorithm exists and works, and there are chips with 1000 noisy qubits, why has nobody used this algorithm to factor a 16-bit number? Why haven't they used it to factor the number 63? Factoring 63 on a quantum computer using a generic algorithm would be a huge advancement in capability, but there's always some reason why your fancy algorithm doesn't work with another guy's fancy hardware.
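
For what it's worth, here is roughly what "factoring 63 with Shor" would amount to: everything below is classical except the order() step, which is the part a quantum computer is supposed to do in polynomial time. I brute-force it here just to show the surrounding reduction (a sketch, not anyone's benchmark):

    from math import gcd
    from random import randrange

    def order(a, n):                 # smallest r > 0 with a**r == 1 (mod n)
        r, x = 1, a % n              # this is the step Shor's algorithm speeds up
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_style_factor(n):
        while True:
            a = randrange(2, n)
            if gcd(a, n) > 1:        # lucky guess already shares a factor
                return gcd(a, n), n // gcd(a, n)
            r = order(a, n)
            if r % 2:
                continue             # need an even order; try another a
            y = pow(a, r // 2, n)
            if y == n - 1:
                continue             # trivial square root of 1; try another a
            p = gcd(y - 1, n)
            if 1 < p < n:
                return p, n // p

    print(shor_style_factor(63))     # nontrivial factors, e.g. (7, 9)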

At the same time, we continue to have no real understanding of the underlying physics of quantum superposition, which is the principle on which this whole thing relies. We know that it happens and we have lots of equations that show that it happens and we have lots of algorithms that rely on it working, but we have continued to be blissfully unaware of why it happens (other than that the math of our theory says so). In the year 3000, physicists will be looking back at these magical parts of quantum theory with the same ridicule we use looking back at the magical parts of Newton's gravity.

ziofill 4 days ago | parent [-]

It’s clear you don’t know what you’re talking about.

pclmulqdq 4 days ago | parent [-]

If you are claiming to know what you're talking about, use one of these algorithms to factor the number 63 and you will get tenure.

The easiest way to prove that you do know what you're doing is to demonstrate it through making progress, which is something that this field refuses to do.

wasabi991011 5 days ago | parent | prev [-]

And that's not even a quantum physics advance, that's a purely algorithmic advance!

There's also been massive advances in terms of quantum engineering.

ktallett 5 days ago | parent | prev [-]

Those are two odd questions to even ask or answer: first, quantum computers exist, and second, we already have them at a certain scale. I assume what they mean is a scale at which they can do calculations that surpass existing classical ones.