| ▲ | Traster 7 hours ago |
| I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app", as long as there are reasonably good alternatives that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it, you can move to Snapchat or Signal or whatever platform you choose. It's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them. In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace privacy, I do think it's on you to also put additional resources into tackling the downsides of that. |
|
| ▲ | londons_explore 7 hours ago | parent | next [-] |
| TikTok has private messaging, and it is used by hundreds of millions of people. IMO no consumer service should have private 1:1 messaging without E2EE. Either only do public messaging (i.e. like a forum), or implement E2EE. |
| |
| ▲ | RobotToaster 6 hours ago | parent | next [-] | | Tiktok has direct messages, they don't even call them private. It's better that they're honest about this, nobody should believe for a second that WhatsApp or FB messages are truly E2EE. DM on social media shouldn't be used for anything remotely private. It's a convenience feature, nothing more. | | |
| ▲ | throw0101c 5 hours ago | parent | next [-] | | > Tiktok has direct messages, they don't even call them private. It may not be called that, but what are users expecting? Some folks may later be surprised when a warrant gets issued (e.g., from a divorce judge). | | |
| ▲ | giancarlostoro 5 hours ago | parent [-] | | If you are a grown adult and don't do research on “messaging apps” (which TikTok is not), then that's really on you. | | |
| ▲ | foobarchu 39 minutes ago | parent | next [-] | | This viewpoint isn't a slippery slope, it's a runaway train. "You moved into a neighborhood with lead pipes? That's on you, should have done more research"
"Your vitamins contained undisclosed allergens? You're an adult, and it didn't say it DIDN'T contain those"
"Passwords stolen because your provider stored them in plaintext? They never claimed to store them securely, so it's really on you" | |
| ▲ | oarsinsync 5 hours ago | parent | prev [-] | | If you are a grown adult and don't do research on "<insert any topic that could have a material negative impact on your life, but that is not currently on your radar as being a topic that could have a material negative impact on your life>" then that's really on you. Unfortunately, this doesn't scale. | | |
|
| |
| ▲ | throwaway290 5 hours ago | parent | prev [-] | | > nobody should believe for a second that WhatsApp or FB messages are truly E2EE That's interesting. You think all the firms that audited the Signal protocol WhatsApp uses, and all the programmers who worked there over the years who could spot a lie and leak it if there were one, are crooks? A valid opinion I guess, but I wouldn't call it "no one should believe for a second." (Curious that you didn't mention Telegram; it is actually marketed as secure and E2EE, and it has completely gimped "secret chats" that are off by default and used by almost nobody.) | | |
| ▲ | giancarlostoro 5 hours ago | parent [-] | | I forget if it's WhatsApp that technically lets you sync chats in unencrypted form to iCloud, which is the “loophole” around this, though you can lock down your iCloud even tighter. I'm not sure Apple can do much if you fully lock down your iCloud; not sure if this has been legally tested? It's not a well-advertised feature, it's just a setting. | | |
| ▲ | oarsinsync 5 hours ago | parent | next [-] | | WhatsApp iPhone syncs to iCloud unencrypted by default[1]. iMessage also syncs to iCloud unencrypted by default[2]. [1] Depends on you paying for iCloud storage, so that you have space for a full phone backup to occur. [2] Might be "free" with "iMessage in iCloud", an option to enable separately. | | |
| ▲ | throwaway290 4 hours ago | parent [-] | | > WhatsApp iPhone syncs to iCloud unencrypted by default[1]. Not true. You must choose whether to enable it when you set up a new phone. On mine it does not back up. | | |
| ▲ | monooso 3 hours ago | parent [-] | | If you must "choose to enable" encryption, that implies it's off by default. If so, GP's statement is accurate. | | |
|
| |
| ▲ | gzread 5 hours ago | parent | prev | next [-] | | The Android version syncs all your chat logs to Google Drive without encryption by default. That's the backdoor. | |
| ▲ | throwaway290 5 hours ago | parent | prev [-] | | Right now there's a switch to enable E2EE for backups, but yeah, I think the default backup is probably a workaround... |
|
|
| |
| ▲ | trashb 6 hours ago | parent | prev | next [-] | | In my experience most forums have private messaging. Additionally I think it is fine to say "we don't support e2ee". I prefer honesty to a bad (leaky) e2ee implementation, at least the user can make an informed choice. | | |
| ▲ | Ekaros 6 hours ago | parent | next [-] | | I agree. At least a stance of "Yes, messages are stored on our servers" is honest. And whether they are accessed by anything other than a limited subpoena is a policy or legal issue. | |
| ▲ | cucumber3732842 4 hours ago | parent | prev [-] | | >In my experience most forums have private messaging. Yeah, but it's kind of accepted that the forum owner could read it all if they so chose. Maybe this is a holdover from the old days when forums arose and encryption was nowhere near the default. |
| |
| ▲ | Bender an hour ago | parent | prev | next [-] | | Adding that private self hosted forums can permit uploads of encrypted files, encrypted with a pre-shared secret or a secret shared over a private self hosted Mumble voice chat server. | |
| ▲ | tuwtuwtuwtuw 5 hours ago | parent | prev | next [-] | | The email protocols would like to have a chat with you. | | |
| ▲ | kgwxd 5 hours ago | parent [-] | | You can bring your own encryption to that, and bring your own client to automate it. | | |
| ▲ | em-bee 4 hours ago | parent [-] | | You can encrypt the content but not the metadata, not even the subject, unless you use a customized client that encodes it (like deltachat, which doesn't use a subject at all), but even then your email address is still exposed. For all intents and purposes, email is not E2EE. | | |
| ▲ | Bender 2 hours ago | parent [-] | | Email encryption is sufficient for most people even if the metadata is exposed. One can simply state in their encrypted email "Bing Bing Bong" or "Why did you not put the trash out?", which might mean to the recipient: "check the second SFTP server" or "let the cat outside" or "jump on my private Mumble chat server" or "get on my private self-hosted IRC server". The email message need not be encrypted, for that matter; the intended payload can be in a headerless encrypted file on a throwaway SFTP server in a tmpfs RAM disk. |
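A quick illustration of the thread's point that encrypting an email protects the body but not the envelope. The cipher below is a toy stand-in for real PGP/S-MIME (an assumption purely for the demo); the addresses and subject are made up. What matters is that the headers stay readable to every relay while the payload becomes opaque:

```python
import base64
from email.message import EmailMessage

def toy_encrypt(plaintext: bytes) -> str:
    # Stand-in for real PGP/S-MIME body encryption (illustration only);
    # the point is that the body becomes opaque while headers do not.
    return base64.b64encode(plaintext[::-1]).decode("ascii")

msg = EmailMessage()
msg["From"] = "alice@example.com"   # readable by every relay
msg["To"] = "bob@example.com"       # readable by every relay
msg["Subject"] = "Bing Bing Bong"   # subjects are headers, also readable
msg.set_content(toy_encrypt(b"check the second SFTP server"))

wire = msg.as_string()
# Metadata survives on the wire in the clear...
assert "alice@example.com" in wire and "Bing Bing Bong" in wire
# ...but the actual payload text does not.
assert "check the second" not in wire
```

Anyone who can subpoena or sniff the relay still learns who talked to whom and when, which is why "email with encryption bolted on" is not E2EE in the full sense.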
|
|
| |
| ▲ | DoneWithAllThat 4 hours ago | parent | prev [-] | | And yet virtually all consumer services with 1:1 messaging lack E2EE. This is a bit of a quixotic position to take. |
|
|
| ▲ | khalic 6 hours ago | parent | prev | next [-] |
| No, saying that e2e encryption makes users _less_ safe is completely dishonest, nothing is fine about this. The logic of "anything is better than before" is also fallacious. |
| |
| ▲ | roncesvalles 6 hours ago | parent | next [-] | | Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation). If it's E2EE, no one except the sender and receiver know about this conversation. You want an MITM in this case to detect/block such things or at least keep record of what's going on for a subpoena. I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either. | | |
| ▲ | shakna 6 hours ago | parent | next [-] | | The receiver has a proven, signed bundle that they can upload with an abuse report, so the evidence carries even stronger weight. They can already decrypt the message, and they can still report it. | | |
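For context on how a receiver-side report can carry cryptographic weight under E2EE: this resembles "message franking" as described for Facebook Messenger. Below is a minimal Python sketch under that assumption; the function names and the plain HMAC construction are illustrative, not any platform's actual protocol:

```python
import hmac, hashlib

def sender_commit(message: bytes, franking_key: bytes) -> bytes:
    # Sender MACs the plaintext with a one-time "franking" key and sends
    # this commitment alongside the (separately encrypted) message.
    return hmac.new(franking_key, message, hashlib.sha256).digest()

def server_tag(commitment: bytes, server_key: bytes) -> bytes:
    # The server binds the commitment to this delivery without ever
    # seeing the plaintext.
    return hmac.new(server_key, commitment, hashlib.sha256).digest()

def verify_report(message: bytes, franking_key: bytes,
                  commitment: bytes, tag: bytes, server_key: bytes) -> bool:
    # To report abuse, the receiver reveals (message, franking_key); the
    # server re-checks both MACs, proving this exact message really went
    # through it -- stronger evidence than a screenshot.
    ok_commit = hmac.compare_digest(
        hmac.new(franking_key, message, hashlib.sha256).digest(), commitment)
    ok_tag = hmac.compare_digest(
        hmac.new(server_key, commitment, hashlib.sha256).digest(), tag)
    return ok_commit and ok_tag
```

The design point: the platform never sees message content in the normal flow, yet a report cannot be forged, because a fabricated message won't match the server-tagged commitment.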
| ▲ | michaelmior 6 hours ago | parent [-] | | Yes, but then the only way to identify this behavior is a report from the minor. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario that a platform has a better opportunity to protect minors if messages aren't encrypted. I'm not saying no E2E messaging apps should exist, but maybe they don't need to exist for minors on social media apps. However, an alternative could be allowing the encryption key to be shared with a parent, so that someone has the ability to monitor messages. | | |
| ▲ | danlitt 6 hours ago | parent [-] | | > I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People aren't paranoid about this sort of thing because they think law enforcement is more effective when it is constrained; how easily crimes can be prosecuted is only one dimension of safety. > However, an alternative could be allowing the sharing of the encryption key with a parent Right, but this is worlds apart from "sharing the encryption key with a private company", is it not? | | |
| ▲ | InsomniacL 5 hours ago | parent | next [-] | | > Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? Police can access your home with a warrant. Police cannot access your E2EE DMs with a warrant. | | |
| ▲ | allreduce 4 hours ago | parent | next [-] | | And they shouldn't be able to. Police accessing DMs is more like "listening to every conversation you ever had in your house (and outside)" than "entering your house". | |
| ▲ | cucumber3732842 4 hours ago | parent | prev [-] | | >Police cannot access your E2EE DMs with a warrant. Well, they kind of can if they nab your cell phone or another device that has a valid access token. I think it's analogous to the police getting at one's safe: you might have removed the contents before they got there, but that's your prerogative. I think this results in acceptable tradeoffs. |
| |
| ▲ | gzread 5 hours ago | parent | prev [-] | | Yes, that is a fair argument and most countries allow the use of surveillance cameras in public for this reason. |
|
|
| |
| ▲ | khalic 6 hours ago | parent | prev | next [-] | | Keeping children safe and prosecuting are two different concepts, only vaguely related. So no, being able to track predators doesn't make children safer. What keeps them safe is teaching them safe communication habits and keeping them away from things like TikTok. We shouldn't make the world a worse place for everyone because some parents can't take care of their children. | | |
| ▲ | cucumber3732842 4 hours ago | parent [-] | | >Keeping children safe and prosecuting are too different concepts, only vaguely related. See also: That time the FBI took over a CSAM site and kept it running so they could nab a bunch of users. | | |
| ▲ | Ajedi32 2 hours ago | parent [-] | | Not necessarily saying what they did was right, but I think there's a strong utilitarian argument to be made that what they did in that case was, in fact, the best way to keep children safe. What's more dangerous? CSAM on the internet? Or actual child predators running loose? | | |
| ▲ | cucumber3732842 2 hours ago | parent [-] | | That stuff spreads and re-spreads just like anything else people download off the internet. There's a pretty strong argument for shutting it down right away. IIRC most users were outside jurisdiction. | | |
| ▲ | integralid an hour ago | parent [-] | | Even if one more person was prosecuted it was worth it. If you shut down an illegal website a new one will show up a month later, with the same people involved, and you achieved nothing. |
|
|
|
| |
| ▲ | gzread 5 hours ago | parent | prev | next [-] | | SimpleX handles this by sending the decryption keys when the receiver reports the message. | |
| ▲ | kgwxd 5 hours ago | parent | prev | next [-] | | Ugh. The kids aren't even safe from the people making, and enforcing laws. This argument should be long over for anyone with eyes or ears. | |
| ▲ | philipallstar 6 hours ago | parent | prev [-] | | Imagine Hamas are your government and want to figure out who's gay. You don't want a MITM in case they can do this. Pick your definition of safe. | | |
| ▲ | trashb 6 hours ago | parent | next [-] | | In that case, don't use TikTok DMs to discuss your sexuality. I think it is strange that people feel they have to be able to talk about sensitive topics over every interface they can get their hands on. Similarly, in "traditional" media you may not want to hold such a private conversation on a radio broadcast. Perhaps you would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy in those mediums. | | |
| ▲ | philipallstar 4 hours ago | parent | next [-] | | I'm commenting in the context of the conversation, not in a vacuum. You could just as (in fact, much more) easily say that children shouldn't be on apps with private messaging enabled. That would help a lot more, and then we could keep e2ee. | |
| ▲ | danlitt 6 hours ago | parent | prev [-] | | > there is more of an expectation of privacy on those medium What does the "p" in "pm" stand for? | | |
| ▲ | trashb 5 hours ago | parent | next [-] | | Excuse me, I confused "private messages" (PM) with "direct messages" (DM). I will update the comment above. | |
| ▲ | gzread 5 hours ago | parent | prev [-] | | it stands for "not a public timeline post" |
|
| |
| ▲ | miki123211 6 hours ago | parent | prev [-] | | This is fine if you have TLS encryption and the platform is not under local control. Sure, they could fabricate some evidence and get access to your messages, in which case, valid point. |
|
| |
| ▲ | miki123211 6 hours ago | parent | prev | next [-] | | It makes certain users less safe in certain situations. E2EE makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not a technical, decision, but if we claim there are any absolutes here, we just ensure that we'll never be taken seriously by anybody who matters. | | |
| ▲ | khalic 5 hours ago | parent [-] | | Claiming e2e makes children less safe is flat out dishonest. And the irony of you criticising “absolutes” after trying to pass one is just delicious. | | |
| ▲ | gzread 5 hours ago | parent [-] | | What are children at risk of, when E2EE is used? What are children at risk of, when E2EE is not used? | | |
|
| |
| ▲ | fendy3002 5 hours ago | parent | prev [-] | | Well, having no E2E encryption is safer than having a half-baked E2E encryption that has a backdoor and can be decrypted by the provider. And as for TikTok's stance, I think they just don't want to get involved with the Chinese government over encryption (and give users a false sense of privacy). |
|
|
| ▲ | jmull 4 hours ago | parent | prev | next [-] |
| It might be fine if they presented an honest choice. They are lying straight off, though... police and safety teams don't read messages only "if they needed to" to keep people safe. They do so for a large variety of other reasons, such as suppressing political dissent and asserting domination and control. I don't think we can expect most people to understand TikTok's BS here either. I notice even a skeptic like you is uncritically echoing the dubious conflation of privacy and CSAM. |
| |
| ▲ | hobs 3 hours ago | parent [-] | | Anyone who doubts the requirement for e2e messaging should not be considered a skeptic, they are fully buying into whatever narrative LEO would like you to believe. |
|
|
| ▲ | mrexcess 3 hours ago | parent | prev | next [-] |
| >I think it's fine to say "You don't really have privacy on this app" Disagree. To analogize why: privacy isn't heated seats, *it's seat belts*. Comfort features and preferences are fine to tailor to your customers and your business model. Jaguar targets a different market than Ford, and that's just fine. Safety features should be non-negotiable for all. Both Jaguar and Ford drivers merit the utmost protection against injury in crashes. Likewise, all applications that offer user messaging functionality should offer non-defective, non-harmful versions of it. To do that, E2E privacy is absolutely necessary. >I just don't see the point in expecting some sort of principled stance out of them. This is the defeatism that adds momentum to a downhill trajectory. Exactly the opposite approach arrests the slide: users expecting their applications and providers to behave in principled ways, and punishing those who do not, are what keep principles alive. Failing to expect lawful and upright behavior from those you depend on, be they political leaders or software providers, guarantees that tomorrow's behavior will be less lawful and upright than yesterday's. Stop writing these people a pass for this horrible behavior and start holding them unreasonably accountable for it; then we'll see behavior start to change in the direction we mostly all agree it needs to go. The most effective protests against internet censorship came from massive grassroots movements, with users drawing a line in the sand that they will not tolerate further impositions on their freedom. >In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform.
The irony is so manifest: billions of people having their privacy stripped by politicians and business elites in the name of protecting our children, while those same politicians and business elites conspire en masse to prey on and traffic our children. If these forces actually took those concerns seriously, rather than seeing them as an opportunity to push ulterior motives, they'd be eating each other alive right now. Half of DC, half of Hollywood, and at least a tenth of most major college administrations would ALL be in the dock. |
| |
| ▲ | Traster 2 hours ago | parent [-] | | Tesla doesn't have parking sensors, and those are a safety feature. There are lots of safety features in cars that are optional; we've got an entire rating system for the safety of cars. We're talking about an app that's controlled by the CCP. I do expect them to take principled stances - stances like "Taiwan is a part of China" and "you can't be openly critical of the leader of the party". They don't have the same principles as you. You can force them to put in E2EE, but you can't force them to be honest or competent about it. I would rather know what we're getting than push them to lie. This is the same thing as the OpenAI/Anthropic situation: you've got Anthropic taking a principled stance and taking pain for it, and you've got OpenAI claiming to take the same stance but somehow agreeing to the terms of the DoW. Do you think it's more likely that Anthropic carelessly caused themselves massive trouble, or that OpenAI is claiming to have won concessions that clearly won't work in practice? I think it's naive to think the former. | |
| ▲ | mrexcess 35 minutes ago | parent [-] | | >We're talking about an app that's controlled by the CCP, I do expect them to take a principled stance Among large-scale internet service providers, who do you expect to take a principled stance, and why do you expect them to take it? If the answer is "nobody", then why keep singling out China? And if the answer isn't "nobody", then how do we apply the same pressures and principles to TikTok and other platforms that offer messaging? This isn't some abstract concern. We know that WESTERN journalists, activists, and others have been murdered in acts of transnational repression that either began with, or were focused and abetted by, communications surveillance aimed at political dissidence. It seems incredibly naive to believe that current Western political and military leadership could ever be dissuaded from taking effective action (and such surveillance and repression campaigns certainly are effective) by moral qualms unsupported by strong checks and balances of accountability. In other words, this sort of repression most likely continues happening to journalists, activists, human rights lawyers, and other political dissidents, in our society, today, enabled by the refusal of our service providers to protect us, their users. It seems incredibly naive - civilization-threateningly so - to write a pass to anyone, let alone Larry Ellison, for opting to deliberately expose "his" users to this risk. Nothing is OK about this dereliction of responsibility towards them. |
|
|
|
| ▲ | dfxm12 3 hours ago | parent | prev [-] |
| Trying to gaslight the public into thinking end to end encryption makes users less safe is not fine. |