dhx 4 hours ago

Amongst the numerous reasons why you _don't_ want to rush into implementing new algorithms is that even the _reference implementation_ (and most other early implementations) of Kyber/ML-KEM included multiple timing side-channel vulnerabilities that allowed for key recovery.[1][2]
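
To give a flavour of how subtle these were: KyberSlash came down to a single division by the modulus q on a secret-derived value, which some compilers/CPUs turn into variable-time code. A rough sketch of the pattern and the multiply-and-shift style of fix (constants are illustrative, not the exact patched code):

    #include <stdint.h>

    #define KYBER_Q 3329

    /* The problematic pattern: dividing a secret-derived value by Q can compile
       to a hardware division whose latency depends on that value. */
    static uint8_t compress_bit_vartime(uint32_t t) {
        return (uint8_t)((((t << 1) + KYBER_Q / 2) / KYBER_Q) & 1);
    }

    /* The style of fix: replace the division with a multiply by a precomputed
       reciprocal (80635 is roughly 2^28 / 3329) plus a shift, which takes the
       same time for every value of t. Constants are illustrative, not a vetted
       patch. */
    static uint8_t compress_bit_consttime(uint32_t t) {
        uint32_t u = (t << 1) + 1665;
        return (uint8_t)(((u * 80635) >> 28) & 1);
    }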

djb has been consistent in his view for decades that cryptography standards need to consider how foolproof they are to implement, so that a minor implementation mistake, one specific to the timing of certain instructions on certain CPU architectures, or to particular compiler optimisations, etc., doesn't break the implementation. See for example the many problems of the NIST P-224/P-256/P-384 ECC curves, which djb has been instrumental in addressing through widespread deployment of X25519.[3][4][5]
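
To make the "foolproof by design" point concrete, here's a minimal sketch of X25519's scalar clamping (assuming a 32-byte little-endian scalar as in RFC 7748; not any particular library's code), which rules out whole classes of caller mistakes by construction:

    #include <stdint.h>

    /* X25519 scalar "clamping": three bit operations that make every scalar
       safe to feed to the ladder. */
    static void x25519_clamp(uint8_t k[32]) {
        k[0]  &= 248;  /* clear low 3 bits: kills small-subgroup components   */
        k[31] &= 127;  /* clear the top bit                                   */
        k[31] |= 64;   /* set the second-highest bit: fixed-length ladder, no
                          secret-dependent handling of leading zeros          */
    }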

[1] https://cryspen.com/post/ml-kem-implementation/

[2] https://kyberslash.cr.yp.to/faq.html / https://kyberslash.cr.yp.to/libraries.html

[3] https://en.wikipedia.org/wiki/Elliptic_curve_point_multiplic...

[4] https://safecurves.cr.yp.to/ladder.html

[5] https://cr.yp.to/newelliptic/nistecc-20160106.pdf

glitchc an hour ago | parent | next [-]

This logic does not follow. Your argument seems to be "the implementation has security bugs, so let's not ratify the standard." That's not how standards work though. Ensuring an implementation is secure is part of the certification process. As long as the scheme itself is shown to be provably secure, that is sufficient to ratify a standard.

If anything, standardization encourages more investment, which means more eyeballs to identify and plug those holes.

johncolanduoni 43 minutes ago | parent | next [-]

No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one. This is a property of the algorithm being specified, not just an individual implementation, and we’ve seen it play out over and over again in cryptography.

I’d actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that trillion dollar market cap companies have trouble implementing them correctly/consistently. Standards shouldn’t just be thrown over the wall and have any problems blamed on the implementations.

glitchc a minute ago | parent [-]

> No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one.

This argument is without merit. ML-KEM/Kyber has already been ratified as the PQC KEM standard by NIST. What you are proposing is that the NIST process was fundamentally flawed. That is a claim that requires serious evidence to back it up.

arccy an hour ago | parent | prev [-]

this is like saying "just use C and don't write any memory bugs": possible, but life would be a lot better if they weren't so easy to write.

johncolanduoni 36 minutes ago | parent | next [-]

Great, you’ve just convinced every C programmer to use a hand rolled AES implementation on their next embedded device. Only slightly joking.

glitchc 3 minutes ago | parent | prev [-]

Yeah except there are certified versions of AES written in C. Which makes your point what exactly?

Foxboron 3 hours ago | parent | prev [-]

> See for example the many problems of NIST P-224/P-256/P-384 ECC curves

What are those problems, exactly? The whitepaper from djb only makes vague claims about the NSA being a malicious actor, but after ~20 years no backdoors or intentional weaknesses have been reliably proven?

crote an hour ago | parent | next [-]

As I understand it, a big issue is that they are really hard to implement correctly. This means that backdoors and weaknesses might not exist in the theoretical algorithm, but still be common in real-world implementations.

On the other hand, Curve25519 is designed from the ground up to be hard to implement incorrectly: there are very few footguns, gotchas, and edge cases. This means that real-world implementations are likely to be correct implementations of the theoretical algorithm.

This means that, even if P-224/P-256/P-384 are on paper exactly as secure as Curve25519, they could still end up being significantly weaker in practice.
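
To make the "footgun" point concrete, here's the kind of branch-free idiom X25519 ladders are built from; a minimal sketch rather than any library's actual code, contrasted with the naive version that leaks the secret bit through a branch:

    #include <stdint.h>

    /* Naive swap: a secret-dependent branch, the classic footgun that can leak
       the scalar bit-by-bit through timing and branch prediction. */
    static void swap_branchy(uint32_t *a, uint32_t *b, uint32_t secret_bit) {
        if (secret_bit) {
            uint32_t t = *a; *a = *b; *b = t;
        }
    }

    /* Constant-time conditional swap, the idiom the Montgomery ladder relies
       on: the same instructions execute whether or not the swap happens. */
    static void cswap(uint32_t *a, uint32_t *b, uint32_t secret_bit) {
        uint32_t mask = (uint32_t)0 - secret_bit;  /* all-ones or all-zeros */
        uint32_t t = mask & (*a ^ *b);
        *a ^= t;
        *b ^= t;
    }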

supernetworks_ 3 hours ago | parent | prev | next [-]

It would be wise to do some basic sanity checks before making claims like "no backdoors from the NSA." Strong encryption has historically been restricted, which is how we ended up with things like DES, 3DES, and Crypto AG. And in the modern internet age, Juniper had a bad time with this one: https://www.wired.com/2013/09/nsa-backdoor/

Usually it’s really hard to distinguish intent, and so it’s possible to develop plausible deniability with committees. Their track record isn’t perfect.

With WPA3, cryptographers warned about the known pitfall of standardizing a timing-sensitive PAKE, and Harkins got it through anyway. Since it was a standard, the Wi-Fi committee gladly selected it, which then resulted in Dragonblood among other bugs. Hash-to-curve techniques have since patched that.
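
To illustrate why a timing-sensitive PAKE is a footgun, here's a toy sketch of the "hunting and pecking" shape (not the 802.11 code; the hash and modulus are made up): the loop's trip count, and therefore the handshake's timing, depends on the secret password, which is exactly what the Dragonblood timing attacks measured. Constant-time hash-to-curve maps avoid the data-dependent loop entirely.

    #include <stdint.h>

    /* Toy prime modulus (2^64 - 59) standing in for the real group parameters. */
    static const uint64_t P = 0xFFFFFFFFFFFFFFC5ULL;

    static uint64_t mulmod(uint64_t a, uint64_t b) {
        return (uint64_t)(((__uint128_t)a * b) % P);
    }

    static uint64_t powmod(uint64_t base, uint64_t exp) {
        uint64_t r = 1;
        while (exp) {
            if (exp & 1) r = mulmod(r, base);
            base = mulmod(base, base);
            exp >>= 1;
        }
        return r;
    }

    /* Euler's criterion: x is a quadratic residue mod P iff x^((P-1)/2) == 1. */
    static int is_square(uint64_t x) { return powmod(x, (P - 1) / 2) == 1; }

    /* Toy stand-in for the real derivation of a candidate x-coordinate. */
    static uint64_t candidate(uint64_t password, uint32_t counter) {
        uint64_t x = password ^ (0x9E3779B97F4A7C15ULL * counter);
        x ^= x >> 31; x *= 0xBF58476D1CE4E5B9ULL; x ^= x >> 29;
        return x % P;
    }

    /* The "hunting and pecking" shape: keep deriving candidates until one is a
       valid x-coordinate. The number of iterations depends on the password. */
    static uint64_t hunt_and_peck(uint64_t password) {
        uint32_t counter = 1;
        uint64_t x = candidate(password, counter);
        while (!is_square(x))
            x = candidate(password, ++counter);
        return x;
    }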

UltraSane 2 hours ago | parent [-]

The NSA changed the S-boxes in DES, which made people suspicious that they had planted a back door. But when differential cryptanalysis was publicly discovered, people realized that the NSA's changes to the S-boxes actually made them more secure against it.

timschmidt 2 hours ago | parent [-]

That was 50 years ago. And since then we have had an NSA employee co-authoring the paper that led to Heartbleed, the backdoor in Dual EC DRBG, which has been successfully exploited by adversaries, and documentation from Snowden confirming NSA compromise of standards-setting committees.

aw1621107 39 minutes ago | parent [-]

> And since then we have an NSA employee co-authoring the paper which led to Heartbleed

I'm confused as to what "the paper which led to Heartbleed" means. A paper proposing/describing the heartbeat extension? A paper proposing its implementation in OpenSSL? A paper describing the bug/exploit? Something else?

And in addition to that, is there any connection between that author and the people who actually wrote the relevant (buggy) OpenSSL code? If the people who wrote the bug were entirely unrelated to the people authoring the paper then it's not clear to me why any blame should be placed on the paper authors.

timschmidt 33 minutes ago | parent [-]

> I'm confused

The original paper that proposed the OpenSSL Heartbeat extension was written by two people: one worked for the NSA, and the other was a student at the time who went on to work for the BND, the "German NSA". The paper's authors also wrote the extension.

I know this because when it happened, I wanted to know who was responsible for making me patch all my servers, so I dug through the OpenSSL patch stream to find the authors.

tptacek 17 minutes ago | parent | next [-]

What does that paper say about implementing the TLS Heartbeat extension with a trivial uninitialized buffer bug?
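
(For anyone following along, the bug itself reduced to a sketch, not the OpenSSL source: the handler trusted the attacker-supplied payload length instead of the number of bytes actually received.)

    #include <stdint.h>
    #include <string.h>

    /* Buggy shape: echo back claimed_len bytes without checking it against
       what was actually received. */
    static size_t heartbeat_echo_buggy(const uint8_t *recv_buf, size_t recv_len,
                                       uint16_t claimed_len, uint8_t *out) {
        (void)recv_len;                      /* the fatal omission */
        memcpy(out, recv_buf, claimed_len);  /* can read up to ~64 KiB past the
                                                payload, leaking nearby memory */
        return claimed_len;
    }

    /* Fixed shape: bounds-check first, discard if the lengths don't add up. */
    static size_t heartbeat_echo_fixed(const uint8_t *recv_buf, size_t recv_len,
                                       uint16_t claimed_len, uint8_t *out) {
        if ((size_t)claimed_len > recv_len)
            return 0;                        /* silently discard the message */
        memcpy(out, recv_buf, claimed_len);
        return claimed_len;
    }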

aw1621107 29 minutes ago | parent | prev [-]

Ah, that clears up the confusion. Thank you for taking the time to explain!

chc4 2 hours ago | parent | prev [-]

They're vulnerable to "High-S" malleable signatures, while ed25519 isn't. No one is claiming they're backdoored (well, some people somewhere probably are), but they do have failure modes that ed25519 doesn't, which is the GP's point.
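
For the curious, "High-S" malleability comes down to simple arithmetic: if (r, s) verifies, so does (r, n - s), where n is the group order, so anything that keys off the signature bytes (transaction IDs, caches, dedup) has to pin one canonical form. A toy sketch with a 64-bit stand-in for the real 256-bit order (not any particular library's API; real implementations do this with bignums):

    #include <stdbool.h>
    #include <stdint.h>

    /* Toy stand-in for the real group order n. */
    static const uint64_t N = 0xFFFFFFFFFFFFFFC5ULL;

    /* A signature is in canonical "low-S" form when s is in the lower half. */
    static bool is_low_s(uint64_t s) {
        return s <= N / 2;
    }

    /* Map (r, s) and (r, N - s) to the same canonical representative. */
    static uint64_t normalize_s(uint64_t s) {
        return is_low_s(s) ? s : N - s;
    }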