khalic 2 days ago

No, saying that e2e encryption makes users _less_ safe is completely dishonest, nothing is fine about this.

The logic of "anything is better than before" is also fallacious.

roncesvalles 2 days ago | parent | next [-]

Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation).

If it's E2EE, no one except the sender and receiver know about this conversation. You want an MITM in this case to detect/block such things or at least keep record of what's going on for a subpoena.

I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.

shakna 2 days ago | parent | next [-]

The receiver has a proven and signed bundle, that they can upload to the abuse report. So the evidence has even stronger weight. They can already decrypt the message, they can still report it.
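The "proven and signed bundle" idea can be sketched with message franking, the scheme Facebook described for abuse reporting in Messenger: the sender attaches a keyed commitment to the plaintext, which the server stores without being able to read the message; a reporter later reveals the plaintext and key so a moderator can verify the report wasn't fabricated. A minimal Python sketch (variable and function names here are illustrative, not any platform's actual API):

```python
import hashlib
import hmac
import os

def frank(message: bytes, franking_key: bytes) -> bytes:
    # Sender computes a commitment to the plaintext and sends it
    # alongside the ciphertext; the server stores the commitment
    # without ever being able to read the message itself.
    return hmac.new(franking_key, message, hashlib.sha256).digest()

def verify_report(message: bytes, franking_key: bytes, commitment: bytes) -> bool:
    # On an abuse report, the recipient reveals the plaintext and the
    # franking key; the moderator recomputes the tag and checks it
    # against the commitment the server stored at send time.
    return hmac.compare_digest(frank(message, franking_key), commitment)

franking_key = os.urandom(32)
commitment = frank(b"abusive message", franking_key)

# A genuine report verifies; fabricated content does not.
assert verify_report(b"abusive message", franking_key, commitment)
assert not verify_report(b"something I made up", franking_key, commitment)
```

The point is that E2EE and verifiable abuse reports are not mutually exclusive: the server never sees plaintext, yet a report carries cryptographic weight.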

michaelmior 2 days ago | parent [-]

Yes, but this leaves the only way to identify this behavior as by reporting from a minor. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted.

I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.

danlitt 2 days ago | parent | next [-]

> I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted

Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are wary of this sort of thing not because they think law enforcement is somehow more effective when it is constrained, but because how easily crimes can be prosecuted is only one dimension of safety.

> However, an alternative could be allowing the sharing of the encryption key with a parent

Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?

michaelmior a day ago | parent | next [-]

> Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?

This is a false equivalency. I don't have to use TikTok DMs if I want E2EE. I don't have a choice about laws that allow the police to violate my rights. I'm not claiming that all E2EE apps should be banned.

> Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?

Exactly why I suggested that as a possible alternative.

danlitt a day ago | parent [-]

> This is a false equivalency.

I'm not making an equivalency. I'm just trying to get you to think about how something that is true at surface level is not necessarily a "fair argument".

> I don't have to use TikTok DMs if I want E2EE.

I don't know why you think this is a convincing argument. It is currently illegal to tap people's phone lines, but when phones were invented it obviously was not illegal. It became illegal in part because people had a reasonable expectation of privacy when using the phone. They also have a reasonable expectation of privacy when using TikTok DMs - that's why people call them "private messages" so often!

> Exactly why I suggested that as a possible alternative.

My point is that you are offering these as alternatives when they are profoundly different proposals. It is like me saying I am pro forced sterilization and then offering as an alternative "we could just only allow it when people ask for it". That's a completely different thing! Having autonomy over your online life as a family rather than necessarily as an individual is totally ok. Surrendering that autonomy is not.

InsomniacL 2 days ago | parent | prev | next [-]

> Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?

Police can access your home with a warrant.

Police cannot access your E2EE DMs with a warrant.

danlitt 2 days ago | parent | next [-]

Not answering my question!

> Police cannot access your E2EE DMs with a warrant.

They can and do, regularly. What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught. But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself.

They also can't prevent you from flushing drugs down the toilet, but somehow people are still convicted for drug-related crimes all the time. So - yes, obviously, the police could prosecute more crimes if we gave up this protection. That's how limitations on police power work.

NoahZuniga a day ago | parent | next [-]

> What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught

If you are pretty confident you're under investigation, then this might be obstruction of justice, and that's pretty illegal.

Tadpole9181 a day ago | parent | prev [-]

> But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself.

Uh, it absolutely isn't? WTF dystopian idea is this?

danlitt a day ago | parent [-]

It certainly can be - destruction of evidence is a crime. If they can prove you destroyed evidence, even if they can't prove that the destroyed evidence incriminates you, that's criminal behaviour. For instance if it's known by some other means you have a conversation history with person X, but not whether that conversation history is incriminating, and then when your phone is searched the conversation history is completely missing, that is strong evidence of a crime.

allreduce 2 days ago | parent | prev | next [-]

And they shouldn't be able to. Police accessing DMs is more like "listening to every conversation you ever had in your house (and outside)" than "entering your house".

cucumber3732842 2 days ago | parent | prev [-]

>Police cannot access your E2EE DMs with a warrant.

Well, they kind of can, if they nab your cell phone or another device that has a valid access token.

I think it's kind of analogous to the police getting at one's safe. You might have removed the contents before they got there but that's your prerogative.

I think this results in acceptable tradeoffs.

gzread 2 days ago | parent | prev [-]

Yes, that is a fair argument and most countries allow the use of surveillance cameras in public for this reason.

danlitt a day ago | parent [-]

"In public" is the operative word (and surveillance cameras in public are extremely recent and very controversial, so this is not as strong an argument as you might think)

EmbarrassedHelp a day ago | parent | prev | next [-]

> I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.

The problem with that idea is that you're implying E2EE should require age verification. Everyone should have access to secure end-to-end encryption.

hogwasher a day ago | parent | prev [-]

Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor? Because no matter how you do that, it would result in false positives, and either unfair auto-bans and erroneous reports to law enforcement (if no human views the images), or human employees viewing other adults' consensual nudes that were meant to be private. Or it would result in adult employees viewing nudes sent from one minor to another minor, which would itself be a major breach of those minors' privacy.

There is a program whereby police can generate hashes of known CSAM images, and those hashes can then be automatically compared against the hashes of photos uploaded to websites, identifying known CSAM without any investigator having to actually view the images and further infringe on the victim's privacy. But that only works against already-known images, and it can be done automatically whenever an image is uploaded, prior to encryption. The encryption doesn't prevent it.
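That comparison step is simple enough to sketch. Note this is a deliberate simplification: real deployments (e.g. PhotoDNA) use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below only matches byte-identical files, and all names here are illustrative:

```python
import hashlib

# Hypothetical hash list distributed by law enforcement. Real systems
# use perceptual hashes robust to resizing/re-encoding; SHA-256 is
# used here only to keep the sketch self-contained.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def matches_known_image(image_bytes: bytes) -> bool:
    # Run client-side on the raw upload, *before* encryption, so the
    # check still works on an E2EE platform.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

assert matches_known_image(b"known-bad-image-bytes")
assert not matches_known_image(b"ordinary holiday photo")
```

This is why "scan at upload time" and "E2EE in transit" are compatible: the match happens before the ciphertext is ever produced.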

Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice.

I'm sure some offenders could be caught this way, but it would also cause so many problems itself.

michaelmior a day ago | parent [-]

> Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor?

No, I was not suggesting that.

gzread 2 days ago | parent | prev | next [-]

SimpleX handles this by sending the decryption keys when the receiver reports the message.

khalic 2 days ago | parent | prev | next [-]

Keeping children safe and prosecuting are two different concepts, only vaguely related. So no, being able to track predators doesn't make children safer. What keeps them safe is teaching them safe communication habits and keeping them away from things like TikTok.

We shouldn't make the world a worse place for everyone because some parents can't take care of their children.

cucumber3732842 2 days ago | parent [-]

>Keeping children safe and prosecuting are two different concepts, only vaguely related.

See also: That time the FBI took over a CSAM site and kept it running so they could nab a bunch of users.

Ajedi32 2 days ago | parent [-]

Not necessarily saying what they did was right, but I think there's a strong utilitarian argument to be made that what they did in that case was, in fact, the best way to keep children safe.

What's more dangerous? CSAM on the internet? Or actual child predators running loose?

cucumber3732842 2 days ago | parent [-]

That stuff spreads and re-spreads just like anything else people download off the internet. There's a pretty strong argument for shutting it down right away. IIRC most users were outside jurisdiction.

integralid 2 days ago | parent [-]

Even if only one more person was prosecuted, it was worth it. If you shut down an illegal website, a new one will show up a month later with the same people involved, and you've achieved nothing.

roughly 2 days ago | parent | prev | next [-]

What was the rate of child exploitation in the GDR?

kgwxd 2 days ago | parent | prev | next [-]

Ugh. The kids aren't even safe from the people making, and enforcing laws. This argument should be long over for anyone with eyes or ears.

philipallstar 2 days ago | parent | prev [-]

Imagine Hamas are your government and want to figure out who's gay. You don't want a MITM in case they can do this.

Pick your definition of safe.

trashb 2 days ago | parent | next [-]

In that case don't use Tiktok dm's to discuss your sexuality. I think it is strange that people feel like they have to be able to talk on sensitive topics over every interface they can get their hands on.

Similarly, in "traditional" media you may not want to discuss such a private conversation on a radio broadcast. Perhaps you would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy on those mediums.

roughly 2 days ago | parent | next [-]

Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.

philipallstar 2 days ago | parent | prev | next [-]

I'm commenting in the context of the conversation, not in a vacuum. You could just as (in fact, much more) easily say that children shouldn't be on apps with private messaging enabled. That would help a lot more, and then we could keep e2ee.

danlitt 2 days ago | parent | prev [-]

> there is more of an expectation of privacy on those mediums

What does the "p" in "pm" stand for?

trashb 2 days ago | parent | next [-]

Excuse me, I confused "private messages" (PM) with "direct messages" (DM).

I'll update the comment above.

danlitt 2 days ago | parent [-]

I don't think you confused anything, except for the terminology the platform uses. There is an obvious expectation of privacy when sending direct messages!

sleepybrett 2 days ago | parent [-]

Hasn't been true ANYTIME IN HISTORY. Hell it was well understood even by children that no conversation you had on the telephone was truly private. That's why cyphers were invented.

danlitt a day ago | parent [-]

What are you talking about? It is illegal to tap people's phone lines or to interfere with mail. Are you saying people don't have a reasonable expectation of privacy even when it's illegal to be spied on?

sleepybrett 16 hours ago | parent [-]

'Illegal' doesn't really mean anything in this, or any other, day and age when you are talking about the very rich, the very powerful, or the state.

The good thing about e2ee is that it probably makes the list of those with the ability to decrypt things encrypted e2e somewhat smaller. Fact is hacking can get to those keys. (i.e. state actor zero-click exploits your phone they are going to be able to get your private key and the messages in memory)

danlitt 15 hours ago | parent [-]

> 'Illegal' doesn't really mean anything in this

This is a thread arguing about what the law should be.

> Fact is hacking can get to those keys.

Everything made by humans is fallible.

gzread 2 days ago | parent | prev [-]

it stands for "not a public timeline post"

danlitt 2 days ago | parent [-]

It should be obvious from how contrived your wording is that nobody thinks of them this way.

miki123211 2 days ago | parent | prev [-]

This is fine if you have TLS encryption and the platform is not local.

Sure, they can fabricate some evidence and get access to your messages, in which case, valid point.

derbOac a day ago | parent | prev | next [-]

It's a kind of Trojan horse propaganda in my opinion.

Users get used to the argument with TikTok and then apply it to other platforms.

Put it this way: why wouldn't those same arguments apply to any platform (if you believed them)?

fendy3002 2 days ago | parent | prev | next [-]

Well, having no E2E encryption is safer than having a half-baked E2E encryption that has a backdoor and can be decrypted by the provider.

And as for TikTok's stance, I think they just don't want to get involved with the Chinese government over encryption (and give users a false sense of privacy).

miki123211 2 days ago | parent | prev [-]

It makes certain users less safe in certain situations.

E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political decision, not a technical one, but if we claim there are any absolutes here, we just make sure we'll never be taken seriously by anybody who matters.

khalic 2 days ago | parent [-]

Claiming e2e makes children less safe is flat out dishonest. And the irony of you criticising “absolutes” after trying to pass one is just delicious.

gzread 2 days ago | parent [-]

What are children at risk of, when E2EE is used?

What are children at risk of, when E2EE is not used?

roughly 2 days ago | parent | next [-]

> What are children at risk of, when E2EE is used?

Potential exposure to abusive adults.

> What are children at risk of, when E2EE is not used?

State-sanctioned violence.

reactordev 2 days ago | parent | prev [-]

This is the argument they can’t have…