johncolanduoni | 44 minutes ago
No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it or find another one. This is a property of the algorithm being specified, not just of an individual implementation, and we've seen it play out over and over again in cryptography. I'd actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that companies with trillion-dollar market caps have trouble implementing them correctly and consistently. Standards shouldn't just be thrown over the wall with any problems blamed on the implementations.
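A concrete instance of the kind of pitfall being described is the KyberSlash timing leak found in early Kyber/ML-KEM reference code, where a division by q on a secret coefficient could run in secret-dependent time. The sketch below is a minimal illustration, not the reference implementation: it compares a variable-time rounding step against the multiply-and-shift form used in the published fix, assuming Kyber's q = 3329 (the constants 1665 = (q+1)/2 and 80635 = floor(2^28 / q) are taken from that fix; the function names and self-check are hypothetical).

    #include <stdint.h>
    #include <stdio.h>

    #define Q 3329  /* Kyber modulus */

    /* Variable-time rounding of a coefficient t in [0, Q) to one
       message bit. The division by Q can compile to a hardware divide
       whose latency depends on the secret operand on some CPUs -- the
       root cause of the KyberSlash leak. */
    static uint16_t msg_bit_vartime(uint16_t t) {
        return (uint16_t)((((uint32_t)t << 1) + Q / 2) / Q) & 1;
    }

    /* Constant-time version: the same rounding via multiply-and-shift,
       with no secret-dependent division. */
    static uint16_t msg_bit_consttime(uint16_t t) {
        uint32_t x = ((uint32_t)t << 1) + 1665; /* 2t + (Q+1)/2 */
        x *= 80635;                             /* 80635 = floor(2^28 / Q) */
        return (uint16_t)((x >> 28) & 1);
    }

    int main(void) {
        /* Self-check: the two roundings agree on every coefficient. */
        for (uint32_t t = 0; t < Q; t++) {
            if (msg_bit_vartime((uint16_t)t) != msg_bit_consttime((uint16_t)t)) {
                printf("mismatch at t = %u\n", t);
                return 1;
            }
        }
        printf("both roundings agree on all %d inputs\n", Q);
        return 0;
    }

Both functions compute the same mathematical rounding the standard specifies; only one rendering of it leaks through timing, which is exactly the "hard to implement correctly" property being argued about.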
glitchc | 2 minutes ago
> No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it or find another one.

This argument is without merit. ML-KEM/Kyber has already been standardized by NIST as the PQC KEM (FIPS 203). What you are proposing amounts to a claim that the NIST process was fundamentally flawed, and a claim like that requires serious evidence to back it up.