| |
| ▲ | upofadown 16 hours ago | parent | next [-] | | Are you referring to "Encrypted message malleability checks are incorrectly enforced causing plaintext recovery attacks"? Seems like a legitimate difference of opinion. The researcher wants a message with an invalid format to return an integrity failure message. Presumably the GnuPG project thinks that would be better handled by some sort of bad format error. The exploit here is a variation on the age-old idea of tricking a PGP user into decrypting an encrypted message and then sending the result to the attacker. The novelty here is the idea of making the encrypted message look like a PGP key (identity) and then asking the victim to decrypt the fake key, sign it and then upload it to a keyserver. Modifying a PGP message file will break the normal PGP authentication[1] (that was not acknowledged in the attack description). So here is the exploit: * The victim receives an unauthenticated/anonymous (unsigned or with a broken signature) message from the attacker. The message looks like a public key. * Somehow (perhaps in another anonymous message) the attacker claims they are someone the victim knows and asks them to decrypt, sign and upload the signed public key to a keyserver. * They see nothing wrong with any of this and actually do what the attacker wants, ignoring the error message about the bad message format. So this attack is also quite unlikely. Possibly that affected the decision of the GnuPG project to not change behaviour in this case, particularly when such a change could possibly introduce other vulnerabilities. [1] https://articles.59.ca/doku.php?id=pgpfan:pgpauth Added: Wait. How would the victim import the bogus PGP key into GPG so they could sign it? There would normally be a preexisting key for that user, so the bogus key would for sure fail to import. It would probably fail anyway. It will be interesting to see what the GnuPG project said about this in their response. | | |
| ▲ | tptacek 16 hours ago | parent [-] | | In the course of this attack, just in terms of what happens in the mechanics of the actual protocol, irrespective of the scenario in which these capabilities are abused, the attacker: (1) Rewrites the ciphertext of a PGP message (2) Introducing an entire new PGP packet (3) That flips GPG into DEFLATE compression handling (4) And then reroutes the handling of the subsequent real message (5) Into something parsed as a plaintext comment This happens without a security message, but rather just (apparently) a zlib error. In the scenario presented at CCC, they used the keyserver example to demonstrate plaintext exfiltration. I kind of don't care. It's what's happening under the hood that's batshit; the "difference of opinion" is that the GnuPG maintainers (and, I guess, you) think this is an acceptable end state for an encryption tool. |
| |
| ▲ | akulbe 17 hours ago | parent | prev [-] | | Is there a better alternative to GPG? | | |
| ▲ | tptacek 17 hours ago | parent | next [-] | | Everything is better than PGP (not just GPG --- all PGP implementations). The problem with PGP is that it's a Swiss Army Knife. It does too many things. The scissors on a Swiss Army Knife are useful in a pinch if you don't have real scissors, but tailors use real scissors. Whatever it is you're trying to do with encryption, you should use the real tool designed for that task. Different tasks want altogether different cryptosystems with different tradeoffs. There's no one perfect multitasking tool. When you look at the problem that way, surprisingly few real-world problems ask for "encrypt a file". People need backup, but backup demands backup cryptosystems, which do much more than just encrypt individual files. People need messaging, but messaging is wildly more complicated than file encryption. And of course people want package signatures, ironically PGP's most mainstream usage, ironic because it relies on only a tiny fraction of PGP's functionality and still somehow doesn't work. All that is before you get to the absolutely deranged 1990s design of PGP, which is a complex state machine that switches between different modes of operation based on attacker-controlled records (which are mostly invisible to users). Nothing modern looks like PGP, because PGP's underlying design predates modern cryptography. It survives only because nerds have a parasocial relationship with it. | | |
| ▲ | palata 15 hours ago | parent | next [-] | | > It survives only because nerds have a parasocial relationship with it. I really would like to replace PGP with the "better" tool, but: * Using my Yubikey for signing (e.g. for git) has a better UX with PGP than with SSH * I have to use PGP to sign packages I send to Maven Maybe I am a nerd emotionally attached to PGP, but after a year signing with SSH, I went back to PGP and it was so much better... | | |
| ▲ | computerfriend 10 hours ago | parent [-] | | > better UX with PGP instead of SSH This might be true of comparing GPG to SSH-via-PIV, but there's a better way with far superior UX: derive an SSH key from a FIDO2 slot on the YubiKey. | | |
| ▲ | palata 3 hours ago | parent [-] | | I do it with FIDO2. It's inconvenient when having multiple Yubikeys (I always end up adding the entry manually with ssh-agent), and I have to touch the Yubikey every time it signs. That makes it very annoying when rebasing a few tens of commits, for instance. With GPG it just works. | | |
| ▲ | ahlCVA an hour ago | parent [-] | | For what it's worth: You can set no-touch-required on a key (it's a generation-time option though). | | |
| ▲ | palata 16 minutes ago | parent [-] | | Sure, but then it is set to no-touch for every FIDO2 interaction I have. I don't want to touch for signing, but I want to touch when using it as a passkey, for instance. |
|
|
|
| |
| ▲ | josephg 5 hours ago | parent | prev | next [-] | | The thing I can't get past with PGP / GPG is that it tries to work around MITM attacks by encouraging users to place their social network on the public record (via public key attestation). This is so insane to me. The whole point of using cryptography is to keep private information private. It's hard to think of ways PGP could fail more as a security / privacy tool. | | |
| ▲ | upofadown 2 hours ago | parent [-] | | Do you mean keyservers? Keyservers have nothing to do with the identity verification required to prevent MITM attacks. There is only one method available for PGP. Comparison of key fingerprints/IDs. Keyservers are simply a convenient way to get a public key (identity). Most people don't have to use them. |
| |
| ▲ | johnisgood 17 hours ago | parent | prev [-] | | Now can you give us a list of all the features of PGP and, for each, a tool that does that one specific thing really well? | | |
| ▲ | akerl_ 17 hours ago | parent | next [-] | | https://www.latacora.com/blog/2019/07/16/the-pgp-problem/#th... | | |
| ▲ | jhgb 15 hours ago | parent | next [-] | | > Use Signal. Or Wire, or WhatsApp, or some other Signal-protocol-based secure messenger. That's a "great" idea considering the recent legal developments in the EU, which OpenPGP, as bad as it is, doesn't suffer from. It would be great if the author updated his advice into something more future-proof. | | |
| ▲ | akerl_ 15 hours ago | parent | next [-] | | There's no future-proof suggestion that's immune to the government declaring it a crime. If you want a suggestion for secure messaging, it's Signal/WhatsApp. If you want to LARP at security with a handful of other folks, GPG is a fine way to do that. | | |
| ▲ | jhgb 14 hours ago | parent | next [-] | | Nobody decided that it's a crime, and it's unlikely to happen. The question is, what do you do about mandatory snooping of centralized proprietary services that renders them functionally useless, aside from "just live with it"? I was hoping for actual advice rather than a snarky non-response, yet here we are. | | |
| ▲ | Fnoord 13 hours ago | parent | next [-] | | > Nobody decided that it's a crime, and it's unlikely to happen. Which jurisdiction are you on about? [1] Pick your poison. For example, UK has a law forcing suspects to cooperate. This law has been used to convict suspects who weren't cooperating. NL does not, but police can use force to have a suspect unlock a device using finger or face. [1] https://en.wikipedia.org/wiki/Key_disclosure_law | |
| ▲ | akerl_ 14 hours ago | parent | prev | next [-] | | I gave you the answer that exists: I'm not aware of any existing or likely-to-exist secure messaging solution that would be a viable recommendation. The available open-source options come nowhere close to the messaging security that Signal/Whatsapp provide. So you're left with either "find a way to access Signal after they pull out of whatever region has criminalized them operating with a backdoor on comms" or "pick any option that doesn't actually have strong messaging security". | | |
| ▲ | johnisgood 12 hours ago | parent [-] | | > messaging security > WhatsApp Eh? There are alternatives, try Ricochet (Refresh) or Cwtch. | | |
| ▲ | akerl_ 12 hours ago | parent [-] | | I stand by what I said. | | |
| ▲ | johnisgood 6 hours ago | parent [-] | | I mean... why? | | |
| ▲ | closewith 4 hours ago | parent [-] | | Not the GP, but most of us want to communicate with other people, which means SMS or WhatsApp. No point having perfect one-time-pad encryption and no one to share pads with. |
|
|
|
| |
| ▲ | closewith 4 hours ago | parent | prev [-] | | You're asking for a technical solution to a political problem. The answer is not to live with it, but to become politically active to try to support your principles. No software can save you from an authoritarian government - you can let that fantasy die. |
| |
| ▲ | anonym29 12 hours ago | parent | prev [-] | | Could you please link the source code for the WhatsApp client, so that we can see the cryptographic keys aren't being stored and later uploaded to Meta's servers, completely defeating the entire point of Signal's E2EE implementation and ratchet protocol? | | |
| ▲ | akerl_ 11 hours ago | parent [-] | | This may shock you, but plenty of cutting-edge application security analysis doesn't start with source code. There are many reasons, but one of them is that for the overwhelming majority of humans on the planet, their apps aren't being compiled from source on their device. So since you have to account for the fact that the app in the App Store may not be what's in some git repo, you may as well just start with the compiled/distributed app. | | |
| ▲ | anonym29 11 hours ago | parent [-] | | Whether or not other people build from source code has zero relevance to a discussion about the trustworthiness of security promises coming from former PRISM data providers about the closed-source software they distribute. Source availability isn't theater, even when most people never read it, let alone build from it. The existence of surreptitious backdoors and dynamic analysis isn't a knock against source availability. Signal and WhatsApp do not belong in the same sentence together. One's open source software developed and distributed by a nonprofit foundation with a lengthy history of preserving and advancing accessible, trustworthy, verifiable encrypted calling and messaging going back to TextSecure and RedPhone, the other's a piece of proprietary software developed and distributed by a for-profit corporation whose entire business model is bulk harvesting of user data, with a lengthy history of misleading and manipulating their own users and distributing user data (including message contents) to shady data brokers and intelligence agencies. To imply these two offer even a semblance of equivalent privacy expectations is misguided, to put it generously. |
|
|
| |
| ▲ | 14 hours ago | parent | prev [-] | | [deleted] |
| |
| ▲ | johnisgood 17 hours ago | parent | prev [-] | | Saw it, not impressed. GnuPG has a lot more features than signing and file encryption. And there are lots of tools for file encryption anyways. I have a bash function using openssh, sometimes I use croc (also uses PAKE), etc. I need an alternative to "gpg --encrypt --armor --recipient <foo>". :) | | |
| ▲ | akerl_ 17 hours ago | parent | next [-] | | I guess we'll have to live with you being unimpressed. | |
| ▲ | some_furry 16 hours ago | parent | prev | next [-] | | > I need an alternative to "gpg --encrypt --armor --recipient <foo>" That's literally age. https://github.com/FiloSottile/age | | |
| ▲ | johnisgood 16 hours ago | parent [-] | | No, because there is no keyring and you have to supply people's public key each time. It is not suitable for large-scale public key management (with unknown recipients), and it does not support automatic discovery or trust management. Age does NOT SUPPORT signing at all either. | | |
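The keyring gap is easy to illustrate: a tiny local alias-to-recipient map recovers the `--recipient <foo>` ergonomics, though not discovery or trust management. A hedged sketch in Python (the alias names and recipient strings are hypothetical, and it only builds the age command line rather than running it):

```python
# Toy "keyring" for age: a local alias -> recipient map, so you don't have to
# paste the full recipient key on every invocation. All data here is made up.
RECIPIENTS = {
    "alice": "age1qyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqs3290gq",
    "bob": "age1lggyhqrw2nlhcxprm67z43rta597azn8gknawjehu9d8pjpvmzuqggy23q",
}

def age_encrypt_argv(names, path):
    """Build the argv for `age` encrypting `path` to the named recipients."""
    argv = ["age", "--armor"]
    for name in names:
        argv += ["--recipient", RECIPIENTS[name]]  # KeyError for unknown aliases
    return argv + ["--output", path + ".age", path]

print(age_encrypt_argv(["alice", "bob"], "notes.txt"))
```

This deliberately punts on the hard parts the comment above names: it has no notion of key discovery, validity, or trust, just local bookkeeping.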
| ▲ | amluto 12 hours ago | parent | next [-] | | > you have to supply people's public key each time Keyrings are awful. I want to supply people’s public keys each time. I have never, in my entire time using cryptography, wanted my tool to guess or infer what key to verify with. (Heck, JOSE has a long history of bugs because it infers the key type, which is also a mistake.) I have an actual commercial use case that receives messages (which are, awkwardly, files sent over various FTP-like protocols, sigh), decrypts and verifies them, and further processes them. This is fully automated and runs as a service. For horrible legacy reasons, the files are in PGP format. I know the public key with which they are signed (provisioned out of band) and I have the private key for decryption (again, provisioned out of band). This would be approximately two lines of code using any sane crypto library [0], but there really isn’t an amazing GnuPG alternative that’s compatible enough. But GnuPG has keyrings, and it really wants to use them and to find them in some home directory. And it wants to identify keys by 32-bit truncated hashes. And it wants to use Web of Trust. And it wants to support a zillion awful formats from the nineties using wildly insecure C code. All of this is actively counterproductive. Even ignoring potential implementation bugs, I have far more code to deal with key rings than actual gpg invocation for useful crypto. [0] I should really not have to even think about the interaction between decryption and verification. Authenticated decryption should be one operation, or possibly two. But if it’s two, it’s one operation to decapsulate a session key and a second operation to perform authenticated decryption using that key. | | |
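The single-operation authenticated decryption described above can be sketched with a stdlib-only encrypt-then-MAC toy (illustrative of the API shape only, not production crypto; a real library would use an AEAD such as ChaCha20-Poly1305, and would derive separate encryption and MAC keys):

```python
import hmac, hashlib, os

def _keystream(key, nonce, n):
    # Toy keystream: HMAC-SHA256 in counter mode. Illustrative only.
    out, ctr = b"", 0
    while len(out) < n:
        out += hmac.new(key, nonce + ctr.to_bytes(8, "big"), hashlib.sha256).digest()
        ctr += 1
    return out[:n]

def seal(key, plaintext):
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # encrypt-then-MAC
    return nonce + ct + tag

def open_sealed(key, blob):
    """One operation: returns the plaintext or raises. The caller never
    sees unverified plaintext and cannot skip the verification step."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("invalid message")  # no guess at the contents
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

key = os.urandom(32)
blob = seal(key, b"hello")
print(open_sealed(key, blob))  # b'hello'
```

Flipping any byte of `blob` makes `open_sealed` raise instead of returning a guess at the contents, which is exactly the "(a) or (b), nothing in between" contract described above.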
| ▲ | mkesper 2 hours ago | parent | next [-] | | Some years ago I wrote "just a little script" to handle encrypting password-store secrets for multiple recipients. It got quite ugly and much more verbose than planned, switching gpg output parsing to Python for sanity.
I think I used a combination of --keyring <mykeyring> --no-default-keyring.
Never would encourage anyone to do this again. | |
| ▲ | upofadown 2 hours ago | parent | prev [-] | | >And it wants to identify keys by 32-bit truncated hashes. That's 64 bits these days. >I should really not have to even think about the interaction between decryption and verification. Messaging involves two verifications. One to ensure that you are sending the message to whom you think you are sending it. The other to ensure that you know who you received a message from. That is an inherent problem. Yes, you can use a shared key for this but then you end up doing both verifications manually. | |
| ▲ | amluto 8 minutes ago | parent [-] | | >> And it wants to identify keys by 32-bit truncated hashes. > That's 64 bits these days. The fact that it’s short enough that I even need to think about whether it’s a problem is, frankly, pathetic. > Messaging involves two verifications. One to ensure that you are sending the message to whom you think you are sending it. The other to ensure that you know who you received a message from. That is an inherent problem. Yes, you can use a shared key for this but then you end up doing both verifications manually. I can’t quite tell what you mean. One can build protocols that do encrypt-then-sign, encrypt-and-sign, sign-then-encrypt, or something clever that combines encryption and signing. Encrypt-then-sign has a nice security proof, the other two combinations are often somewhat catastrophically wrong, and using a high quality combination can have good performance and nice security proofs. But all of the above should be the job of the designer of a protocol, not the user of the software. If my peer sends me a message, I should provision keys, and then I should pass those keys to my crypto library along with a message I received (and perhaps whatever session state is needed to detect replays), and my library should either (a) tell me that the message is invalid and not give me a guess as to its contents or (b) tell me it’s valid and give me the contents. I should not need to separately handle decryption and verification, and I should not even be able to do them separately even if I want to. |
|
| |
| ▲ | some_furry 16 hours ago | parent | prev [-] | | Why is a keyring important to you? Would "fetch a short-lived age public key" serve your use case? If so, then an age plugin that builds atop the AuxData feature in my Fediverse Public Key Directory spec might be a solution. https://github.com/fedi-e2ee/public-key-directory-specificat... But either way, you shouldn't have long-lived public keys used for confidentiality. It's a bad design to do that. | | |
| ▲ | deknos 4 hours ago | parent | next [-] | | We need a keyring at our company, because there is no other medium of communication where you reach both management and technical people at other companies. And we have massive issues with the ongoing cry of "shut everything off" followed by no improvement and no alternative, because we have to talk with people of other organizations (and every organization runs their own mailserver) and the only really common way of communication is mail. And when everyone has a GPG key, you get... what? A keyring. You could say we do not need gpg, because we control the mailserver, but what if a mailserver is compromised and the mails are still in mailboxes? The public keys are not that public, only known to the correspondents. Still, it's an issue, and we have a keyring | |
| ▲ | johnisgood 16 hours ago | parent | prev [-] | | > you shouldn't have long-lived public keys used for confidentiality. This statement is generic and misleading. Using long-lived keys for confidentiality is bad in real-time messaging, but for non-ephemeral use cases (file encryption, backups, archives) it is completely fine AND desired. > Would "fetch a short-lived age public key" serve your use case? Sadly no. | | |
| ▲ | soatok 16 hours ago | parent [-] | | (This is some_furry, I'm currently rate-limited. I thought this warranted a reply, so I switched to this account to break past the limit for a single comment.) > This statement is generic and misleading. It may be generic, but it's not misleading. > Using long-lived keys for confidentiality is bad in real-time messaging, but for non-ephemeral use cases (file encryption, backups, archives) it is completely fine. What exactly do you mean by "long-lived"? The "lifetime" of a key being years (for a long-lived backup) is less important than how many encryptions are performed with said key. The thing you don't want is to encrypt 2^50 messages under the same key. Even if it's cryptographically safe to do that, any post-compromise key rotation will be a fucking nightmare. The primary reason to use short-lived public keys is to limit the blast radius. Consider these two companies: Alice Corp. uses the same public key for 30+ years. Bob Ltd. uses a new public key for each quarter over the same time period. Both parties might retain the secret key indefinitely, so that if Bob Ltd. needs to retrieve a backup from 22 years ago, they still can. Now consider what happens if both of them lose their currently-in-use secret key due to a Heartbleed-style attack. Alice has 30 years of disaster recovery to contend with, while Bob only has up to 90 days. Additionally, file encryption, backups, and archives typically use ephemeral symmetric keys at the bottom of the protocol. Even when a password-based key derivation function is used (and passwords are, for whatever reason, reused), the password hashing function usually has a random salt, thereby guaranteeing uniqueness. The idea that "backups" magically mean "long-lived" keys are on the table, without nuance, is extremely misleading. > > Would "fetch a short-lived age public key" serve your use case? > Sadly no. shrug Then, ultimately, there is no way to securely satisfy your use case. | | |
| ▲ | johnisgood 16 hours ago | parent [-] | | You introduced "short-lived" vs "long-lived", not me. Long-lived as wall-clock time (months, years) is the default interpretation in this context. The Alice / Bob comparison is asymmetric in a misleading way. You state Bob Ltd retains all private keys indefinitely. A Heartbleed-style attack on their key storage infrastructure still compromises 30 years of backups, not 90 days. Rotation only helps if only the current operational key is exposed, which is an optimistic threat model you did not specify. Additionally, your symmetric key point actually supports what I said. If data is encrypted with ephemeral symmetric keys and the asymmetric key only wraps those, the long-lived asymmetric key's exposure does not enable bulk decryption without obtaining each wrapped key individually. > "There is no way to securely satisfy your use case" No need to be so dismissive. Personal backup encryption with a long-lived key, passphrase-protected private key, and offline storage is a legitimate threat model. Real-world systems validate this: SSH host keys, KMS master keys, and yes, even PGP, all use long-lived asymmetric keys for confidentiality in non-ephemeral contexts. And to add to this, incidentally, age (the tool you mentioned) was designed with long-lived recipient keys as the expected use case. There is no built-in key rotation or expiry mechanism because the authors considered it unnecessary for file encryption. If long-lived keys for confidentiality were inherently problematic, age would be a flawed design (so you might want to take it up with them, too). In any case, yeah, your point about high-fan-out keys with large blast radius is correct. That is different from "long-lived keys are bad for confidentiality" (see above with regard to "age"). | |
| ▲ | maxtaco 15 hours ago | parent | next [-] | | An intended use case for FOKS (https://foks.pub) is to allow long-lived durable shared secrets between users and teams with key rotation when needed. | |
| ▲ | stackghost 10 hours ago | parent | prev [-] | | >Personal backup encryption with a long-lived key, passphrase-protected private key, and offline storage is a legitimate threat model ... If you're going to use a passphrase anyway, why not just use a symmetric cipher? In fact, for file storage, why not use an encrypted disk volume so you don't need to use PGP? | |
| ▲ | johnisgood 6 hours ago | parent [-] | | That was just me being goofy in that bit (and only that), but I hope the rest of my message went across. :) > In fact for file storage why not use an encrypted disk volume so you don't need to use PGP? Different threat models. Disk encryption (LUKS, VeraCrypt, plain dm-crypt) protects against physical theft. Once mounted, everything is plaintext to any process with access. File-level encryption protects files at rest and in transit: backups to untrusted storage, sharing with specific recipients, storing on systems you do not fully control. You cannot send someone a LUKS volume to decrypt one file, and backups of a mounted encrypted volume are plaintext unless you add another layer. |
|
|
|
|
|
|
| |
| ▲ | baobun 16 hours ago | parent | prev [-] | | sq (sequoia) should be able to sort that. | | |
|
| |
| ▲ | miki123211 4 hours ago | parent | prev | next [-] | | This is exactly that, in more detail than you could possibly ever ask for: https://soatok.blog/2024/11/15/what-to-use-instead-of-pgp/ | |
| ▲ | some_furry 17 hours ago | parent | prev [-] | | https://soatok.blog/2024/11/15/what-to-use-instead-of-pgp/ I wrote this to answer this exact question last year. | | |
| ▲ | palata 2 hours ago | parent | next [-] | | > The only downside to Sigstore is it hasn’t been widely adopted yet. Which, from where I stand, means that PGP is the only viable solution because I don't have a choice. I can't replace PGP with Sigstore when publishing to Maven. It's nice to tell me I'm dumb because I use PGP, but really it's not my choice. > Use SSH Signatures, not PGP signatures. Here I guess it's just me being dumb on my own. Using SSH signatures with my Yubikeys (FIDO2) is very inconvenient. Using PGP signatures with my Yubikeys literally just works. > Encrypted Email: Don’t encrypt email. I like this one, I keep seeing it. Sounds like Apple's developer support: if I need to do something and ask for help, the answer is often: "Don't do it. We suggest you only use the stuff that just works and be happy about it". Sometimes I have to use emails, and cryptographers say "in that case just send everything in plaintext because eventually some of your emails will be sent in plaintext anyway". Isn't it like saying "no need to use Signal, eventually the phone of one of your contacts will be compromised anyway"? | |
| ▲ | xeonmc 7 hours ago | parent | prev [-] | | offtopic question: as a recent dabbling reader of introductory popsci content in cryptography, I've been wondering about the different segmentation of expert roles in the field. E.g. in Filippo's blogpost about age he clarified that he's not a cryptographer but rather a cryptography engineer; is that also what your role is? What are the concrete divisions of labor, and what other related but separate positions exist in the overall landscape? Where is the cutoff point of "don't roll your own crypto" at the different levels of expertise? | |
| ▲ | johnisgood 6 hours ago | parent [-] | | You did not ask me, but you should do your due diligence because there are way too many armchair cryptographers around here. |
|
|
|
| |
| ▲ | coppsilgold 17 hours ago | parent | prev | next [-] | | Depending on what you are after, an alternative could be using SSH keys for signatures and age[1] for encryption targeting SSH keys. [1] <https://github.com/FiloSottile/age> | |
| ▲ | baobun 16 hours ago | parent | prev | next [-] | | sq (sequoia) is compatible and is available in your favorite distro. It's the recommended replacement. https://book.sequoia-pgp.org/about_sequoia.html | | |
| ▲ | zimmerfrei 7 hours ago | parent [-] | | This is the right answer. The problem mostly concerns the oldest parts of PGP (the protocol), which gpg (the implementation) doesn't want to or cannot get rid of. |
| |
| ▲ | vbezhenar 25 minutes ago | parent | prev | next [-] | | age | |
| ▲ | 17 hours ago | parent | prev [-] | | [deleted] |
|
|
| |
| ▲ | akerl_ 17 hours ago | parent | next [-] | | It's not like GPG solves for secure key distribution. GPG keyservers are a mess, and you can't trust their contents anyways unless you have an out of band way to validate the public key. Basically nobody is using web-of-trust for this in the way that GPG envisioned. This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website. | | |
| ▲ | 65a 16 hours ago | parent [-] | | Yes, not saying that web of trust ever worked. A "pre-established channel" is one of the other mechanisms I mentioned, like a central authority (https) or TOFU (just trust the first key you get). All of these have some issues, that any alternative must also solve for. | |
| ▲ | akerl_ 16 hours ago | parent [-] | | So if we need a pre-established channel anyways, why would people recommending a replacement for GPG workflows need to solve for secure key distribution? This is a bit like looking at electric cars and saying ~"well you can't claim to be a viable replacement for gas cars until you can solve flight" |
|
| |
| ▲ | woodruffw 17 hours ago | parent | prev | next [-] | | A lot of people are using PGP for things that don’t require any kind of key distribution. If you’re just using it to encrypt files (even between pointwise parties), you can probably just switch to age. (We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.) | |
| ▲ | kaoD 17 hours ago | parent | prev [-] | | This is not the first time I see "secure key distribution" mentioned in HN+(GPG alternatives) context and I'm a bit puzzled. What do you mean? Web of Trust? Keyservers? A combination of both? Under what use case? | | |
| ▲ | kpil 17 hours ago | parent | next [-] | | I'm assuming they mean the old way of signing each other's keys. As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people. Or at least, more realistically, to a few nerds. I think I signed 3-4 people's keys. The process had - as they say - a low WAF. | |
| ▲ | dale_glass 16 hours ago | parent [-] | | > As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people. GPG is terrible at that. 0. Alice's GPG trusts Alice's key tautologically.
1. Alice's GPG can trust Bob's key because it can see Alice's signature.
2. Alice's GPG can trust Carol's key because Alice has Bob's key, and Carol's key is signed by Bob. After that, things break. GPG has no tools for finding longer paths like Alice -> Bob -> ??? -> signature on some .tar.gz. I'm in the "strong set", I can find a path to damn near anything, but only with a lot of effort. The good way used to be using the path finder, some random website maintained by some random guy that disappeared years ago. The bad way is downloading a .tar.gz, checking the signature, fetching the key, then fetching every key that signed it, in the hopes somebody you know signed one of those, and so on. And GPG is terrible at dealing with that; it hates having tens of thousands of keys in your keyring from such experiments. GPG never grew into the modern era. It was made for persons who mostly know each other directly. Addressing the problem of finding a way to verify the keys of random free software developers isn't something it ever did well. | |
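The missing path search is ordinary graph reachability over the signature graph; a hedged sketch in Python (the key names and signature edges are hypothetical):

```python
from collections import deque

# Hypothetical signature graph: signer -> keys they have signed.
SIGNATURES = {
    "alice": ["bob"],
    "bob": ["carol", "dave"],
    "carol": ["eve"],
    "dave": ["eve", "maintainer"],
}

def trust_path(start, target):
    """Breadth-first search for a shortest chain of signatures from a key
    we already trust to the key that signed the artifact."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        for nxt in SIGNATURES.get(path[-1], []):
            if nxt == target:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no trust path exists

print(trust_path("alice", "maintainer"))  # ['alice', 'bob', 'dave', 'maintainer']
```

The search itself is trivial; the hard parts a real tool would face are fetching the graph data, and weighting edges by trust level and revocation status rather than treating every signature as equal.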
| ▲ | tptacek 16 hours ago | parent [-] | | What's funny about this is that the whole idea of the "web of trust" was (and, as you demonstrate, is) literally PGP punting on this problem. That's how they talked about it at the time, in the 90s, when the concept was introduced! But now the precise mechanics of that punt have become a critically important PGP feature. | | |
| ▲ | dale_glass 15 hours ago | parent [-] | | I don't think it punted so much as it never had that as an intended use case. I vaguely recall the PGP manuals talking about scenarios like a woman secretly communicating with her lover, or Bob introducing Carol to Alice, and people reading fingerprints over the phone. I don't think long trust chains and the use case of finding a trust path to some random software maintainer on the other side of the planet were part of the intended design. I think to the extent the Web of Trust was supposed to work, it was assumed you'd have some familiarity with everyone along the chain, and work through it step by step. Alice would know Bob, who'd introduce his friend Carol, who'd introduce her friend Dave. |
|
|
| |
| ▲ | 65a 16 hours ago | parent | prev [-] | | In a signature context, you probably want someone else to know that "you" signed it (I can think of other cases, but that's the usual one). The way to do that requires them to know that the key which signed the data belongs to you. My only point is that this is actually the hard part, which any "replacement" crypto system needs to solve for, and that solving that is hard (none of the methods are particularly good). | | |
| ▲ | Avamander 10 hours ago | parent | next [-] | | > The way to do that requires them to know that the key which signed the data belongs to you. This is something S/MIME does and I wouldn't say it doesn't do so well. You can start from mailbox validation and that already beats everything PGP has to offer in terms of ownership validation. If you do identity validation or it's a national PKI issuing the certificate (like in some countries) it's a very strong guarantee of ownership. Coughing baby (PGP) vs hydrogen bomb level of difference. It much more sounds to me like an excuse to use PGP when it doesn't even remotely offer what you want from a replacement. | |
| ▲ | afiori 14 hours ago | parent | prev [-] | | I think it should be mostly ad-hoc methods: if you have a website, put your keys in a dedicated page and direct people there. If you are in an org, there can be whatever kind of centralised repo. Add the hashes to your email signature and/or profile bios. There might be a nice uniform solution using DNS and derived keys, like certificate chains? I am not sure, but I think it might not be necessary |
|
|
|