glenstein | 3 hours ago
Thank you, that seems to be the whole ball game for me right there. I understood the sarcastic tone as a kind of exasperation, but it means something in the context of an extremely concerning attempt to ram through a questionable algorithm that is not well understood and risks a version of an NSA backdoor, where the only real protection is the integrity of standards adoption processes like this one. You've really got to stick with the substance over the tone to be able to follow the ball here.

Everyone was losing their minds over GDPR introducing a potential back door to encrypted chat apps that security agencies could access. This goes to exactly the same category of concern, and as you note it has precedent! So yeah, the NSA potentially sneaking a backdoor into an approved standard is pretty outrageous, worth objecting to in the strongest terms, and when that risk is present it should be subjected to the highest conceivable standard of scrutiny.

In fact, I found this to be the strongest point in the article - there are any number of alternatives that might (1) prove easier to implement, (2) prove more resilient to future attacks, or (3) turn out to be the most efficient. Just because you want to do something in the future doesn't mean it needs to be ML-KEM specifically, and the idea of throwing out ECC is almost completely inexplicable unless you're the NSA, you can't break it, and you're trying to propose a new standard that doesn't include it. How is that not a hair-on-fire level of concern?
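To make the "why keep ECC" point concrete: the alternative most people have in mind is a hybrid exchange, where you run X25519 and ML-KEM side by side and feed both shared secrets into one KDF, so the session key stays safe unless both break. Here's a rough sketch, not a real implementation - the X25519 and HKDF calls use the pyca/cryptography library, while mlkem_encapsulate is a hypothetical stand-in for whatever ML-KEM binding you'd actually use:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def mlkem_encapsulate(peer_mlkem_public_key: bytes) -> tuple[bytes, bytes]:
        # Hypothetical placeholder: a real ML-KEM binding would return
        # (ciphertext, shared_secret). Random bytes here just mark where
        # it plugs in.
        return os.urandom(1088), os.urandom(32)

    def hybrid_client_secret(peer_x25519_pub, peer_mlkem_pub: bytes) -> bytes:
        # Classical half: ordinary X25519 ECDH against the peer's public key.
        eph = X25519PrivateKey.generate()
        ecdh_secret = eph.exchange(peer_x25519_pub)

        # Post-quantum half: ML-KEM encapsulation against the peer's KEM key.
        _ciphertext, mlkem_secret = mlkem_encapsulate(peer_mlkem_pub)

        # Combine both secrets; an attacker has to break *both* X25519
        # and ML-KEM to recover the derived session key.
        return HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=b"hybrid x25519+mlkem example",
        ).derive(ecdh_secret + mlkem_secret)

Dropping the ECC half of a construction like that buys you essentially nothing, and costs you the property that a future break of ML-KEM alone doesn't expose your traffic.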