| ▲ | NSA and IETF, part 3: Dodging the issues at hand (blog.cr.yp.to) |
| 217 points by upofadown 5 hours ago | 73 comments |
| |
|
| ▲ | seethishat 4 hours ago | parent | next [-] |
| For context, djb has been doing and saying these things since he was a college student: While a graduate student at the University of California at Berkeley, Bernstein completed the development of an encryption equation (an "algorithm") he calls "Snuffle." Bernstein wishes to publish (a) the algorithm, (b) a mathematical paper describing and explaining the algorithm, and (c) the "source code" for a computer program that incorporates the algorithm. Bernstein also wishes to discuss these items at mathematical conferences, college classrooms and other open public meetings. The Arms Export Control Act and the International Traffic in Arms Regulations (the ITAR regulatory scheme) required Bernstein to submit his ideas about cryptography to the government for review, to register as an arms dealer, and to apply for and obtain from the government a license to publish his ideas. Failure to do so would result in severe civil and criminal penalties. Bernstein believes this is a violation of his First Amendment rights and has sued the government.
After four years and one regulatory change, the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional. - Source https://www.eff.org/cases/bernstein-v-us-dept-justice
|
|
| ▲ | dhx 3 hours ago | parent | prev | next [-] |
| Amongst the numerous reasons why you _don't_ want to rush into implementing new algorithms is that even the _reference implementation_ (and most other early implementations) of Kyber/ML-KEM included multiple timing side channel vulnerabilities that allowed for key recovery.[1][2] djb has been consistent in his view for decades that cryptography standards need to consider the foolproofness of implementation, so that a minor implementation mistake specific to the timing of specific instructions on specific CPU architectures, or to specific compiler optimisations, etc., doesn't break the implementation. See for example the many problems of the NIST P-224/P-256/P-384 ECC curves, which djb has been instrumental in fixing through widespread deployment of X25519.[3][4][5] [1] https://cryspen.com/post/ml-kem-implementation/ [2] https://kyberslash.cr.yp.to/faq.html / https://kyberslash.cr.yp.to/libraries.html [3] https://en.wikipedia.org/wiki/Elliptic_curve_point_multiplic... [4] https://safecurves.cr.yp.to/ladder.html [5] https://cr.yp.to/newelliptic/nistecc-20160106.pdf |
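A rough Python sketch of the pattern behind KyberSlash: the decoder divides a secret-dependent value by the modulus q, which in compiled C can become a variable-time division, and the published fix replaces the division with a multiply-and-shift. The constants below mirror that fix, but treat the snippet as illustrative rather than actual reference code:

    KYBER_Q = 3329  # Kyber/ML-KEM modulus

    def decode_bit_leaky(t: int) -> int:
        # Analogue of the vulnerable C: division by q on a secret-derived value.
        # In compiled C this division can take time that depends on t.
        return (((t << 1) + KYBER_Q // 2) // KYBER_Q) & 1

    def decode_bit_constant_time(t: int) -> int:
        # The fix: replace the division with a multiply-and-shift whose timing
        # does not depend on t.
        t = (t << 1) + 1665
        return ((t * 80635) >> 28) & 1

    # Both compute the same bit for every valid coefficient value.
    assert all(decode_bit_leaky(t) == decode_bit_constant_time(t)
               for t in range(KYBER_Q))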
| |
| ▲ | glitchc 16 minutes ago | parent | next [-] | | This logic does not follow. Your argument seems to be "the implementation has security bugs, so let's not ratify the standard." That's not how standards work though. Ensuring an implementation is secure is part of the certification process. As long as the scheme itself is shown to be provably secure, that is sufficient to ratify a standard. If anything, standardization encourages more investment, which means more eyeballs to identify and plug those holes. | | |
| ▲ | johncolanduoni 5 minutes ago | parent | next [-] | | No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one. This is a property of the algorithm being specified, not just an individual implementation, and we’ve seen it play out over and over again in cryptography. I’d actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that trillion dollar market cap companies have trouble implementing them correctly/consistently. Standards shouldn’t just be thrown over the wall and have any problems blamed on the implementations. | |
| ▲ | arccy 10 minutes ago | parent | prev [-] | | this is like saying just use C and don't write any memory bugs. possible, but life could be a lot better if it weren't so easy to do so. |
| |
| ▲ | Foxboron 2 hours ago | parent | prev [-] | | > See for example the many problems of NIST P-224/P-256/P-384 ECC curves What are those problems exactly? The whitepaper from djb only makes vague claims about the NSA being a malicious actor, but after ~20 years no known backdoors nor intentional weaknesses have been reliably proven? | | |
| ▲ | crote 14 minutes ago | parent | next [-] | | As I understand it, a big issue is that they are really hard to implement correctly. This means that backdoors and weaknesses might not exist in the theoretical algorithm, but still be common in real-world implementations. On the other hand, Curve25519 is designed from the ground up to be hard to implement incorrectly: there are very few footguns, gotchas, and edge cases. This means that real-world implementations are likely to be correct implementations of the theoretical algorithm. This means that, even if P-224/P-256/P-384 are on paper exactly as secure as Curve25519, they could still end up being significantly weaker in practice. | |
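One concrete example of the footgun-avoidance being described is the constant-time conditional swap at the heart of X25519's Montgomery ladder: selection happens with mask arithmetic rather than a branch on a secret bit. A small conceptual sketch (Python itself gives no real timing guarantees, and the names here are illustrative, not from any particular library):

    def cswap(swap_bit: int, a: int, b: int, bits: int = 256):
        # Swap a and b iff swap_bit == 1, using only mask arithmetic.
        # Real X25519 ladders run this every step, keyed by a secret scalar bit.
        mask = -swap_bit & ((1 << bits) - 1)  # all ones if swap_bit == 1, else 0
        t = mask & (a ^ b)
        return a ^ t, b ^ t

    def cswap_branchy(swap_bit: int, a: int, b: int):
        # The "obvious" version: a secret-dependent branch that can leak the
        # scalar through timing or branch prediction.
        return (b, a) if swap_bit else (a, b)

    assert cswap(0, 5, 9) == (5, 9) == cswap_branchy(0, 5, 9)
    assert cswap(1, 5, 9) == (9, 5) == cswap_branchy(1, 5, 9)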
| ▲ | supernetworks_ 2 hours ago | parent | prev | next [-] | | It would be wise for people to remember that it's worth doing basic sanity checks before making claims like "no backdoors from the NSA". Strong encryption has been restricted historically, so we had things like DES and 3DES and Crypto AG. In the modern internet age Juniper had a bad time with this one: https://www.wired.com/2013/09/nsa-backdoor/. Usually it's really hard to distinguish intent, and so it's possible to develop plausible deniability with committees. Their track record isn't perfect. With WPA3, cryptographers warned about the known pitfall of standardizing a timing-sensitive PAKE, and Harkins got it through anyway. Since it was a standard, the WiFi committee gladly selected it, which resulted in Dragonblood among other bugs. The techniques for hash-to-curve have since patched that. | | |
| ▲ | UltraSane an hour ago | parent [-] | | The NSA changed the S-boxes in DES, and this made people suspicious they had planted a back door, but when differential cryptanalysis was publicly discovered, people realized that the NSA's changes to the S-boxes made them more secure against it. | | |
| ▲ | timschmidt an hour ago | parent [-] | | That was 50 years ago. And since then we have an NSA employee co-authoring the paper which led to Heartbleed, the backdoor in Dual EC DRBG which has been successfully exploited by adversaries, and documentation from Snowden which confirms NSA compromise of standards setting committees. | | |
| ▲ | aw1621107 a minute ago | parent [-] | | > And since then we have an NSA employee co-authoring the paper which led to Heartbleed I'm confused as to what "the paper which led to Heartbleed" means. A paper proposing/describing the heartbeat extension? A paper proposing its implementation in OpenSSL? A paper describing the bug/exploit? Something else? And in addition to that, is there any connection between that author and the people who actually wrote the relevant (buggy) OpenSSL code? If the people who wrote the bug were entirely unrelated to the people authoring the paper then it's not clear to me why any blame should be placed on the paper authors. |
|
|
| |
| ▲ | chc4 an hour ago | parent | prev [-] | | They're vulnerable to "High-S" malleable signatures, while ed25519 isn't. No one is claiming they're backdoored (well, some people somewhere probably are), but they do have failure modes that ed25519 doesn't, which is the GP's point. |
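For readers unfamiliar with the term, "High-S" malleability means that for any valid ECDSA signature (r, s) over these curves, (r, n - s) also verifies, so a third party can produce a second valid signature without knowing the key. A short sketch of the usual low-S normalization, using the P-256 group order purely for illustration:

    # Group order n of NIST P-256, shown for illustration.
    N = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551

    def normalize_low_s(r: int, s: int) -> tuple[int, int]:
        # Map (r, s) to its canonical low-S form; (r, s) and (r, N - s) verify
        # against the same message and public key.
        return (r, N - s) if s > N // 2 else (r, s)

    def is_low_s(s: int) -> bool:
        # Strict verifiers (e.g. Bitcoin consensus rules) reject high-S forms.
        return 1 <= s <= N // 2

Ed25519 sidesteps this particular issue because RFC 8032 accepts only one canonical encoding of s.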
|
|
|
| ▲ | zahllos 3 hours ago | parent | prev | next [-] |
| In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM-only standard for key exchange. Here's the thing. The existence of a standard does not mean we need to use it for most of the internet. There will also be hybrid standards, and most of the rest of us can simply ignore the existence of ML-KEM-only. However, NSA's CNSA 2.0 (commercial cryptography you can sell to the US Federal Government) does not envisage using hybrid schemes. So there's some sense in having a standard for that purpose. Better developed through the IETF than forced on browser vendors directly by the US, I think. There was rough consensus to do this. Should we have a single-cipher kex standard for HQC too? I'd argue yes, and no, the NSA doesn't propose to use it (unless they updated CNSA). The requirement of the NIST competition is that all standardized algorithms resist both classical and quantum attacks. Some have said in this thread that lattice crypto is relatively new, but it actually has quite some history, going back to Ajtai in '97. If you want paranoia, there are always code-theory-based schemes going back to McEliece in the late '70s. We don't know what we don't know, which is why there's HQC (code based) waiting on standardisation and an additional on-ramp for signatures, plus the expense (size and sometimes statefulness) of hash-based options. So there's some argument that single-cipher is fine, and we have a whole set of alternative options. This particular overreaction appears to be yet another in a long-running series of... disagreements with the entire NIST process, including "claims" around the security level of what we then called Kyber, insults to the NIST team's security level estimation in the form of suggesting they can't do basic arithmetic (given we can't factor anything bigger than 15 on a real quantum computer and we simply don't have hardware anywhere near breaking RSA, estimates are exactly what these are), and so on. |
| |
| ▲ | HelloNurse 2 hours ago | parent | next [-] | | The metaphor near the beginning of the article is a good summary: standardizing cars with seatbelts, but also cars without seatbelts. Since ML-KEM is supported by the NSA, it should be assumed to have an NSA-known backdoor that they want to be used as much as possible: IETF standardization is a great opportunity for a long-term social engineering operation, much like DES, Clipper, the more recent funny elliptic curve, etc. | | |
| ▲ | zahllos an hour ago | parent | next [-] | | I will reply directly regarding the analogy itself here. It is a poor one at best, because it assumes ML-KEM is akin to "internetting without cryptography". It isn't. If you want a better analogy, we have a seatbelt for cars right now. It turns out when you steal plutonium and hot-rod your DeLorean into a time machine, these seatbelts don't quite cut the mustard. So we need a new kind of seatbelt. We design one that should be as good for the school run as it is for time travel to 1955. We think we've done it but even after extensive testing we're not quite sure. So the debate is whether to put on two seatbelts (one traditional one we know works for traditional driving, and one that should be good for both) or if we can just use the new one on the school run and for going to 1955. We are nowhere near DeLoreans that can travel to 1955 either. | |
| ▲ | blintz an hour ago | parent | prev | next [-] | | > Since ML-KEM is supported by the NSA, it should be assumed to have a NSA-known backdoor that they want to be used as much as possible AES and RSA are also supported by the NSA, but that doesn’t mean they were backdoored. | | |
| ▲ | HelloNurse 20 minutes ago | parent | next [-] | | AES and RSA had enough public scrutiny to make backdooring them imprudent. The standardization of an obviously weaker option than more established ones is difficult to explain with security reasons, so the default assumption should be that there are insecurity reasons. | |
| ▲ | zahllos 41 minutes ago | parent | prev [-] | | SHA-2 was designed by the NSA. Nobody is saying there is a backdoor. |
| |
| ▲ | MYEUHD an hour ago | parent | prev [-] | | > the more recent funny elliptic curve Can you elaborate please? | | |
| ▲ | zahllos an hour ago | parent | next [-] | | The commenter means Dual_EC, a random number generator. The backdoor was patented in the form of "escrow" here: https://patents.google.com/patent/US8396213B2/en?oq=USOO83.9... - replace "escrow" with "backdoor" everywhere in the text and what was done will fall out. ML-KEM/ML-DSA were adapted into standards by NIST, but I don't think a single American was involved in the actual initial design. There might be some weakness the NSA knows about that the rest of us don't, but the fact they're going ahead and recommending these be used for US government systems suggests they're fine with it. Unless they want to risk this vulnerability also being discovered by China/Russia and used to read large portions of USG internet traffic. In their position I would not be confident that a vulnerability I was aware of would remain secret, although I am not a US citizen or even resident, and never have been. | |
| ▲ | rdtsc an hour ago | parent | prev [-] | | Not op, but they probably meant https://en.wikipedia.org/wiki/Dual_EC_DRBG |
|
| |
| ▲ | adgjlsfhk1 an hour ago | parent | prev | next [-] | | The problem with standardizing bad crypto options is that you are then exposed to all sorts of downgrade attack possibilities. There's a reason TLS1.3 removed all of the bad crypto algorithms that it had supported. | | |
| ▲ | ekr____ 25 minutes ago | parent | next [-] | | There were a number of things going on with TLS 1.3 and paring down the algorithm list. First, we wanted both to get rid of static RSA and to standardize on a DH-style exchange. This also allowed us to move the first encrypted message in 1-RTT mode to the first flight from the server. You'll note that while TLS 1.3 supports KEMs for PQ, they are run in the opposite direction from TLS 1.2, with the client supplying the public key and the server signing the transcript, just as with DH. Second, TLS 1.3 made a number of changes to the negotiation which necessitated defining new code points, such as separating symmetric algorithm negotiation from asymmetric algorithm negotiation. When those new code points were defined, we just didn't register a lot of the older algorithms. In the specific case of symmetric algorithms, we also only use AEAD-compatible encryption, which restricted the space further. Much of the motivation here was security, but it was also about implementation convenience, because implementers didn't want to support a lot of algorithms for TLS 1.3. It's worth noting that at roughly the same time, TLS relaxed the rules for registering new code points, so that you can register them without an RFC. This allows people to reserve code points for their own usage, but doesn't require the IETF to get involved and (hopefully) reduces pressure on other implementers to actually support those code points. | |
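A toy sketch of the direction of flow described above, in case it helps: the client generates the (KEM or DH) key pair, the server encapsulates to it, and server authentication comes from signing the transcript rather than from decryption as in TLS 1.2 static RSA. The "KEM" here is a deliberately insecure placeholder used only to show who sends what:

    import hashlib, os

    def toy_keygen():
        sk = os.urandom(32)
        pk = hashlib.sha256(b"pk" + sk).digest()
        return pk, sk

    def toy_encaps(pk):
        ct = os.urandom(32)                    # the "ciphertext" is just coins here,
        ss = hashlib.sha256(pk + ct).digest()  # which is why this is NOT a real KEM
        return ct, ss

    def toy_decaps(sk, ct):
        pk = hashlib.sha256(b"pk" + sk).digest()
        return hashlib.sha256(pk + ct).digest()

    # ClientHello: the client generates the key pair and sends pk in key_share.
    client_pk, client_sk = toy_keygen()

    # ServerHello: the server encapsulates to the client's pk and returns ct.
    ct, server_ss = toy_encaps(client_pk)

    # The client recovers the same shared secret; the server then proves its
    # identity separately by signing the handshake transcript (CertificateVerify).
    assert toy_decaps(client_sk, ct) == server_ss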
| ▲ | blintz an hour ago | parent | prev [-] | | TLS 1.3 did do that, but it also fixed the ciphersuite negotiation mechanism (and got formally verified). So downgrade attacks are a moot point now. |
| |
| ▲ | crote 34 minutes ago | parent | prev | next [-] | | > In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM only standard for key exchange. No, that's background dressing by now. The bigger issue is how the IETF is trying to railroad a standard by violating its own procedures, ignoring all objections, and banning people who oppose it. They are literally doing the kind of thing we always accuse China of doing. ML-KEM-only is obviously being pushed for political reasons. If you're not willing to let a standard be discussed on its technical merits, why even pretend to have a technology-first industry working group? Seeing standards being corrupted like this is sickening. At least have the gall to openly claim it should be standardized because it makes things easier for the NSA - and by extension (arguably) increases national security! | |
| ▲ | vorpalhex 2 hours ago | parent | prev | next [-] | | The standard will be used, as it was the previous time the IETF allowed the NSA to standardize a known-weak algorithm. Sorry that someone calling out a math error makes the NIST team feel stupid. Instead of dogpiling the person for not stroking their ego, maybe they should correct the error. Last I checked, a quantum computer isn't needed to handle exponents; a whiteboard will do. | |
| ▲ | zahllos an hour ago | parent [-] | | ML-KEM and ML-DSA are not "known weak". The justification for hybrid crypto is that there might be classical cryptanalytic results we aren't aware of, although lattice problems come with worst-case hardness reductions of a kind that RSA and discrete log simply don't have. That's reasonable as a maximal-safety measure, but comes with additional cost. Obviously the standard will be used. As I said in a sibling comment, the US Government fully intends to do this whether the IETF makes a standard or not. |
| |
| ▲ | aaomidi 3 hours ago | parent | prev [-] | | Except when the government starts then mandating a specific algorithm. And yes. This has happened. There’s a reason there’s only the NIST P Curves in the WebPKI world. | | |
| ▲ | zahllos 2 hours ago | parent [-] | | "The government" already has. That's what CNSA 2.0 means - this is the commercial crypto the NSA recommends for the US Government and what will be in FIPS/CAVP/CMVP. ML-KEM-only for most key exchange. In this context, it is largely irrelevant whether or not the IETF chooses to have a single-standard draft. There's a code point from IANA to do this in TLS already, and it will happen for US Government systems. I'd also add that personally I consider the NIST P-curves to be absolutely fine crypto. Complete formulas exist, so it's possible to have failure-free ops, although point-on-curve needs to be checked. They don't come with the small-order subgroup problem of any Montgomery curve. ECDSA isn't great alone; the hedged variants from RFC 6979 and later drafts should be used. Since ML-KEM is key exchange, X25519 is very widely used in TLS unless you need to turn it off for FIPS. For the certificate side, the actual WebPKI, I'm going to say RSA wins out (still) (I think). |
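For context on the RFC 6979 remark: plain ECDSA leaks the private key if the per-signature nonce k is ever reused or biased, so the deterministic and "hedged" variants derive k from the private key and message, optionally mixed with fresh randomness. The sketch below is only the idea, not the actual HMAC-DRBG construction RFC 6979 specifies, and the group order shown is P-256's:

    import hashlib, hmac, os

    # Group order of NIST P-256, for illustration.
    N = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551

    def hedged_nonce(private_key: bytes, msg_hash: bytes) -> int:
        # Derive k from the key and message so a broken RNG alone can no longer
        # leak the private key; the os.urandom() input is the "hedge" against
        # fault attacks on the purely deterministic variant.
        extra = os.urandom(32)
        digest = hmac.new(private_key, msg_hash + extra, hashlib.sha256).digest()
        return int.from_bytes(digest, "big") % (N - 1) + 1  # force k into [1, N-1]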
|
|
|
| ▲ | kiray 2 hours ago | parent | prev | next [-] |
| I have been tracking this for months. There is clearly, at best, bias in moderation on the IETF list, or possibly far worse. When djb was suspended for an innocuous reason, other participants were at the same time engaging in activity that would usually be met with permabans (name calling, bullying, etc.). They were not banned. He's been up against serious adversity but continues to protect the lesser informed. This is why djb is in the Cypherpunks Hall of Fame! [1] [1] https://cypherpunkshall.github.io |
| |
| ▲ | Foxboron 2 hours ago | parent | next [-] | | > This is why djb is in the Cypherpunks Hall of Fame! [1] This is a list made by you 2 weeks ago? EDIT:
Okay lol. I actually browsed the list and found multiple dubious entries, along with Trump! Hilarious list. 10/10. | | |
| ▲ | jonesjohnson 2 hours ago | parent [-] | | what do you expect, when the tagline at the end of the page says "In crypto we trust."? Honestly, it's a bit sad. There are many great people on that list, but some seem a bit random and some are just straight up cryptobros, which makes the whole thing a joke, unfortunately |
| |
| ▲ | anonym29 2 hours ago | parent | prev [-] | | Name calling, bullying (forms of systematic harassment) and attempting to instill feelings of social isolation in a target are documented techniques employed by intelligence agencies in both online and offline discourse manipulation / information warfare. You can read up more here if you are curious: https://www.statewatch.org/media/documents/news/2015/jun/beh... Many of the attacks against djb line up quite nicely with "discredit" operational objectives. | | |
|
|
| ▲ | abhv 4 hours ago | parent | prev | next [-] |
| 20+2 (conditional support) versus 7. 22/29 ≈ 76% in some form of "yea". That feels like "rough consensus".
| |
| ▲ | stavros an hour ago | parent | next [-] | | > That OMB rule, in turn, defines "consensus" as follows: "general agreement, but not necessarily unanimity, and includes a process for attempting to resolve objections by interested parties, as long as all comments have been fairly considered, each objector is advised of the disposition of his or her objection(s) and the reasons why, and the consensus body members are given an opportunity to change their votes after reviewing the comments". From https://blog.cr.yp.to/20251004-weakened.html#standards, linked in TFA. | |
| ▲ | f33d5173 2 hours ago | parent | prev | next [-] | | A consensus is 100%. A rough consensus should be near 100%. 2/3 is a super majority. That's a very different standard. | |
| ▲ | jcranmer 3 hours ago | parent | prev | next [-] | | The standard used in the C and C++ committees is essentially a 2-to-1 majority in favor. I'm not aware of any committee where a 3-to-1 majority is insufficient to get an item to pass. DJB's argument that this isn't good enough would, by itself, be enough for me to route his objections to /dev/null; it's so tedious and snipey that it sours the quality of his other arguments by mere association. And overall, it gives the impression of someone who is more interested in derailing the entire process than in actually trying to craft a good standard. | | |
| ▲ | crote 41 minutes ago | parent | next [-] | | Standards - especially security-critical ones - shouldn't be a simple popularity contest. DJB provided lengthy, well-reasoned, and well-sourced arguments against adoption with his "nay" vote. The "aye" votes didn't make a meaningful counter-argument - in most cases they didn't even bother to make any argument at all and merely expressed support. This means there are clearly unresolved technical issues left - and not just the regular bikeshedding ones. If he'd been the only "nay" vote it might've been something which could be ignored as a mad hatter - but he wasn't. Six other people agreed with him. Considering the potential conflict of interest, the most prudent approach would be to route the unsubstantiated aye-votes to /dev/null: if you can't explain your vote, how can we be sure your vote hasn't been bought? | |
| ▲ | pfortuny 2 hours ago | parent | prev | next [-] | | You are turning “consensus” into “majority” and those are not the same. | |
| ▲ | jcranmer an hour ago | parent [-] | | There was a recent discussion within the C committee over what exactly constituted consensus, owing to a borderline vote that was surprisingly ruled "no consensus" (and the crux of the discussion was the difference between a "no" and an "abstain" vote for consensus purposes). The decision was that it had to be ⅔ on favor/(favor + against), and ¾ on (favor + neutral) / (favor + against + neutral). These are the actual rules of the committee now for determining consensus. Similar rules exist for the C++ committee. If there is any conflation going on, I am not the one doing it. |
| |
| ▲ | vorpalhex 2 hours ago | parent | prev [-] | | We're talking about a landmine in a crypto spec and you're bikeshedding about consensus ratios. We should talk about the NSA designed landmine. |
| |
| ▲ | ImPostingOnHN 9 minutes ago | parent | prev [-] | | consensus is not a synonym for majority, supermajority, or for any fraction of the whole, unless the fraction is 100% |
|
|
| ▲ | blintz an hour ago | parent | prev | next [-] |
| Standardizing a codepoint for a pure ML-KEM version of TLS is fine. TLS clients always get to choose what ciphersuites they support, and nothing forces you to use it. He has essentially accused anyone who shares this view of secretly working for the NSA. This is ridiculous. You can see him do this on the mailing list: https://mailarchive.ietf.org/arch/browse/tls/?q=djb |
| |
| ▲ | dataflow 34 minutes ago | parent | next [-] | | > standardizing a code point (literally a number) for a pure ML-KEM version of TLS is fine. TLS clients always get to choose what ciphersuites they support, and nothing forces you to use it. I think the whole point is that some people would be forced to use it due to other standards picking previously-standardized ciphers. He explains and cites examples of this in the past. > He has essentially accused anyone who shares this view of secretly working for the NSA. This is ridiculous. He comes with historical and procedural evidence of bad faith. Why is this ridiculous? If you see half the submitted ciphers being broken, and lies and distortions being used to shove the others through, and historical evidence of the NSA using standards as a means to weaken ciphers, why wouldn't you equate that to working for the NSA (or something equally bad)? | |
| ▲ | ImPostingOnHN 2 minutes ago | parent | prev [-] | | Sunlight is the best disinfectant. I see one group of people shining it and another shading the first group. |
|
|
| ▲ | pverheggen an hour ago | parent | prev | next [-] |
| While it's true that six others unequivocally opposed adoption, we don't know how many of those oppose the chairs' claim that they have consensus. This may be a normal ratio to move forward with adoption; you'd have to look at past IETF proceedings to get a sense for that. One other factor which comes into play: some people can't stand his communication style. When disagreed with, he tends to dig in his heels and write lengthy responses that question people's motives, like in this blog post and others. Accusing the chairs of corruption may have influenced how seriously his complaint was taken. |
| |
| ▲ | dataflow 42 minutes ago | parent | next [-] | | > One other factor which comes in to play, some people can't stand his communication style. When disagreed with, he tends to dig in his heels and write lengthly responses that question people's motives, like in this blog post and others. I don't have context on this other than the linked page, but if what he's saying is accurate, it does seem pretty damning and corrupt, no? Why all the lies and distortions otherwise - how does one assume a generous explanation for lies and distortions? | |
| ▲ | ImPostingOnHN 6 minutes ago | parent | prev [-] | | > Accusing the chairs of corruption may have influenced how seriously his complaint was taken. If you alter your official treatment of somebody because they suggested you might be corrupt (in other words, because of your personal feelings), then you have just confirmed their suggestion. |
|
|
| ▲ | philipwhiuk 4 hours ago | parent | prev | next [-] |
| For an employee at NIST who operates a NIST email address to claim they have no association with NIST is farcical: https://web.archive.org/web/20251122075555/https://mailarchi... https://www.nist.gov/people/quynh-dang |
| |
| ▲ | amszmidt 4 hours ago | parent | next [-] | | ”No association” and “I am not a representative” are quite different things to say. | | |
| ▲ | philipwhiuk 4 hours ago | parent [-] | | You represent your organisation regardless of whether you cloak yourself in an alternate email | | |
| ▲ | amszmidt 3 hours ago | parent | next [-] | | An employee doesn't act as an official representative of their employer, nor do they speak for the employer in any official capacity. That is what the message says. The individual also didn't cloak their identity (which would imply some malicious intent); they simply did not use their work email. Nothing wrong with that. | |
| ▲ | conception 3 hours ago | parent | prev [-] | | I’m sorry, can you state which organization you are speaking for with this comment? It wasn’t immediately clear. |
|
| |
| ▲ | 6581 4 hours ago | parent | prev | next [-] | | That's not what the message you linked claims at all. Maybe you missed the "in this message" at the end of the sentence? | | |
| ▲ | philipwhiuk 4 hours ago | parent [-] | | No not really - I don’t think choosing to post from an alternative email removes the association issue that the original intent is trying to capture. |
| |
| ▲ | hosteur 3 hours ago | parent | prev [-] | | What is your agenda? |
|
|
| ▲ | jancsika an hour ago | parent | prev | next [-] |
| Dear some seasoned cryptographer, Please ELI5: what is the argument for including the non-hybrid option in this standard? Is it a good argument in your expert opinion? My pea brain: implementers plus options equals bad, newfangled minus entrenched equals bad, alice only trust option 1 but bob only have option 2 = my pea brain hurt! |
|
| ▲ | throw0101a 4 hours ago | parent | prev | next [-] |
| Perhaps related: from 2022, on his (FOIA?) lawsuit against the government: * https://news.ycombinator.com/item?id=32360533 From 2023, "Debunking NIST's calculation of the Kyber-512 security level": * https://news.ycombinator.com/item?id=37756656 |
|
| ▲ | 0xbadcafebee 2 hours ago | parent | prev | next [-] |
| tl;dr DJB is trying to stop the NSA railroading bad crypto into TLS standards, the objections deadline is in two days, and they're stonewalling him. This /. story fills in the backstory: https://it.slashdot.org/story/25/11/23/226258/cryptologist-d... Normal practice in deploying post-quantum cryptography is to deploy ECC+PQ. IETF's TLS working group is standardizing ECC+PQ. But IETF management is also non-consensually ramming a particular NSA-driven document through the IETF process, a "non-hybrid" document that adds just PQ as another TLS option.
|
|
| ▲ | g-mork 4 hours ago | parent | prev | next [-] |
| Handforth Parish council Internet edition. You have no authority here, djb! No authority at all |
|
| ▲ | ants_everywhere 4 hours ago | parent | prev | next [-] |
| D. J. Bernstein is very well respected and for very good reason. And I don't have firsthand knowledge of the background here, but the blog posts about the incident have been written in a kind of weird voice that makes me feel like I'm reading about the US Government suppressing evidence of Bigfoot or something. Stuff like this > Wow, look at that: "due process".... Could it possibly be that the people writing the law were thinking through how standardization processes could be abused?" is both accusing the other party of bad faith and also heavily using sarcasm, which is a sort of performative bad faith. Sarcasm can be really effective when used well. But when a post is dripping with sarcasm and accusing others of bad faith it comes off as hiding a weak position behind contempt. I don't know if this is just how DJB writes, or if he's adopting this voice because he thinks it's what the internet wants to see right now. Personally, I would prefer a style where he says only what he means without irony and expresses his feelings directly. If showing contempt is essential to the piece, then the Linus Torvalds style of explicit theatrical contempt is probably preferable, at least to me. I understand others may feel differently. The style just gives me crackpot vibes and that may color reception of the blog posts to people who don't know djb's reputation. |
| |
| ▲ | amiga386 4 hours ago | parent | next [-] | | It's very simple. ECC is well understood and has not been broken over many years. ML-KEM is new, and hasn't had the same scrutiny as ECC. It's possible that the NSA already knows how to break this, and has chosen not to tell us, and NIST plays the useful idiot. NIST has played the useful idiot before, when it promoted Dual_EC_DRBG, and the US government paid RSA to make it the default CSPRNG in their crypto libraries for everyone else... but eventually word got out that it's almost certainly an NSA NOBUS special, and everyone started disabling it. Knowing all that, and planning for a future where quantum computers might defeat ECC -- it's not defeated yet, and nobody knows when in the future that might happen... would you choose: Option A): encrypt key exchange with ECC and the new unproven algorithm Option B): throw out ECC and just use the new unproven algorithm NIST tells you option B is for the best. NIST told you to use Dual_EC_DRBG. W3C adopted EME at the behest of Microsoft, Google and Netflix. Microsoft told you OOXML is a valid international standard you should use instead of OpenDocument (and it just so happens that only one piece of software, made by Microsoft, correctly reads and writes OOXML). So it goes on. Standards organisations are very easily corruptible when their members are allowed to have conflicts of interest and politick and rules-lawyer the organisation into adopting their pet standards. | |
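A minimal sketch of the difference between the two options, assuming illustrative names and a single hash standing in for TLS's actual HKDF key schedule: in Option A both shared secrets feed the key derivation, so an attacker has to break both X25519 and ML-KEM; in Option B the session rests on ML-KEM alone.

    import hashlib

    def option_a_hybrid(ecdh_ss: bytes, mlkem_ss: bytes, transcript: bytes) -> bytes:
        # Hybrid: both secrets go into the key derivation, so the session key
        # stays safe unless BOTH key exchanges are broken.
        return hashlib.sha256(ecdh_ss + mlkem_ss + transcript).digest()

    def option_b_pq_only(mlkem_ss: bytes, transcript: bytes) -> bytes:
        # PQ-only: if ML-KEM turns out to be weak, the session key falls with it.
        return hashlib.sha256(mlkem_ss + transcript).digest()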
| ▲ | jcranmer 2 hours ago | parent | next [-] | | > Standards organisations are very easily corruptable when its members are allowed to have conflicts of interest and politick and rules-lawyer the organisation into adopting their pet standards. FWIW, in my experience on standardization committees, the worst example I've seen of rules-lawyering to drive standards changes is... what DJB's doing right now. There's a couple of other egregious examples I can think of, where people advocating against controversial features go in full rules-lawyer mode to (unsuccessfully) get the feature pulled. I've never actually seen any controversial feature make it into a standard because of rules-lawyering. | | |
| ▲ | dataflow 24 minutes ago | parent [-] | | What exactly are you calling "rules-lawyering"? Is citing rules and pointing out their blatant violation "rules-lawyering"? If so, can you explain why it is better to avoid this, and what should be done instead? As an outsider I'd understand it differently: reading rules and pointing out their lack of violation (perhaps in letter), when people feel like you violated it (perhaps in spirit), is what would be rules-lawyering. You're agreeing on what the written rules are, but interpreting actions as following vs. violating them. That's quite different from an accusation of rules violation followed by silence or distortions or outright lies. If someone is pointing out that you're violating the rules and you're lying or staying silent or distorting the facts, you simply don't get to dismiss or smear them with a label like "rules-lawyer". For rules to be followed, people have to be able to enforce them. Otherwise it's just theater. |
| |
| ▲ | glenstein 3 hours ago | parent | prev [-] | | Thank you, that seems to be the whole ball game for me right there. I understood the sarcastic tone as kind of exasperation, but it means something in the context of an extremely concerning attempt to ram through a questionable algorithm that is not well understood and risks a version of an NSA backdoor, and the only real protection would be the integrity of standards adoption processes like this one. You've really got to stick with the substance over the tone to be able to follow the ball here. Everyone was losing their minds over GDPR introducing a potential back door to encrypted chat apps that security agencies could access. This goes to the exact same category of concern, and as you note it has precedent! So yeah, the NSA potentially sneaking a backdoor into an approved standard is pretty outrageous, and worth objecting to in the strongest terms, and when that risk is present it should be subjected to the highest conceivable standard of scrutiny. In fact, I found this to be the strongest point in the article - there are any number of alternatives that might (1) prove easier to implement, (2) prove more resilient to future attacks, or (3) turn out to be the most efficient. Just because you want to do something in the future doesn't mean it needs to be ML-KEM specifically, and the idea of throwing out ECC is almost completely inexplicable unless you're the NSA and you can't break it and you're trying to propose a new standard that doesn't include it. How is that not a hair-on-fire level concern? |
| |
| ▲ | jonstewart 4 hours ago | parent | prev [-] | | He’s smart and prolific, for sure, but I lost respect for him several years ago. | | |
| ▲ | johnisgood 4 hours ago | parent [-] | | Why, if I might respectfully ask? | | |
| ▲ | jonstewart 3 hours ago | parent [-] | | Sure! First, while I'm in no position to judge cryptographic algorithms, the success of ChaCha and 25519 speaks for itself. More prosaically, Patricia/critbit trees and his other tools are the right thing, and foresighted. He's not just smart, but also prolific. However, he's left a wake of combative controversy his entire career, of the "crackpot" type the parent comment notes, and at some point it'd be worth his asking, AITA? Second, his unconditional support of Jacob Appelbaum has been bonkers. He's obviously smart and uncompromising but, despite having been in the right on some issues, his scorched-earth approach/lack of judgment seems to have turned his paranoia about everyone being out to get him into a self-fulfilling prophecy. | |
| ▲ | philodeon 30 minutes ago | parent | next [-] | | Jacob Appelbaum has not been convicted of any crimes. At least in the United States, you are innocent until proven guilty. At least one of the Appelbaum accusers is a self-admitted schizophrenic who has been committed to a mental institution as recently as this year. | |
| ▲ | johnisgood 2 hours ago | parent | prev [-] | | I do not understand your last paragraph. :/ |
|
|
|
|
|
| ▲ | GauntletWizard an hour ago | parent | prev [-] |
| The NSA has railroaded bad crypto before [1]. The correct answer is to just ignore it, to say "okay, this is the NSA's preferred backdoored crypto standard, and none of our actual implementations will support it." It is not acceptable for the government to be forcing bad crypto down our throats, it is not acceptable for the NSA to be poisoning the well this way, but for all I respect DJB, they are "playing the game" and 20 to 7 is consensus. [1] https://en.wikipedia.org/wiki/Dual_EC_DRBG |