Aurornis a day ago

Many will cheer for any case that hurts Meta without reading the details, but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption:

> The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

The New York case has explicitly gone after their support of end-to-end encryption as a target: https://www.reuters.com/legal/government/meta-executive-warn...

mjevans a day ago | parent | next [-]

The correct nuance here is...

* Classifying accounts as child accounts (moderated by a parent)

* Allowing account moderators to review content in the account that is moderated (including assigning other moderation tools of choice)

In all cases, transparency and enabling consumer choice should be the core focus.

Additionally: by default, treat everyone online as an adult. Parents who allow their kids online without supervision, or without some setting indicating that the user agent is operated by a child, are choosing to let their children interact with strangers. This tends to work out better in more controlled and limited circumstances where the adults involved have the resources to provide suitable supervision.

At the same time, any requirements should apply only to commercial products. Community (gratis / not for profit) efforts presumably reflect the needs of a given community.

itissid 42 minutes ago | parent | next [-]

I think getting the age question right is key to making parental classification work properly (right now platforms just ask for a birth date, which is lame), e.g.:

> Surveys by Britain’s tech regulator, Ofcom, find that among children aged 10-12, over half use Snapchat, more than 60% TikTok and more than 70% WhatsApp. All three apps have a notional minimum age of 13: https://archive.ph/y3pQO

Once you get the classification correct (and AI cannot do this; it only works via community ombudsmen/age verifiers, in a privacy-first way*), the app stores can easily tell app devs which accounts are sensitive, and filtering should be much more effective.

*Basically, once your age is verified by a real human for your device (using device-local encryption to verify biometrics), you are set. No kid should be able to bypass this and install apps on devices that their parents hand to them. There will always be black-market devices with these apps, but existing tech offers ways to keep those to a very small minority.

kelseyfrog a day ago | parent | prev [-]

> Classifying accounts as child accounts

It's ok to drive Dad's truck unless he catches you and tells you no.

IAmBroom 11 hours ago | parent [-]

Unfair presentation. What they suggested was more akin to, "Assume someone with keys is an adult, and let them start the truck."

Dad should either know his children would never drive the truck without permission, or keep his keys as safe as his wallet (and if he can't trust his kids with keys, you bet his wallet needs protection).

pylua a day ago | parent | prev | next [-]

I’m actually okay with not letting underage people use E2E. I’m not okay with blocking everyone. I have 2 kids.

fourside a day ago | parent | next [-]

I understand the concern but then to make this available for adults you now have to provide proof of age to companies, which opens up another can of privacy worms.

skybrian a day ago | parent | next [-]

Theoretically we don't actually need proof of age. Websites need to know when the user is attempting to create an account or log in from a child-locked device. Parents need to make sure their kids only have child-locked devices. Vendors need to make sure they don't sell unlocked devices to kids.
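A minimal sketch of the scheme described above. Everything here is hypothetical: no `child_locked_device` signal exists as a web standard today; the point is only that the site branches on a device flag instead of ever learning an age.

```python
# Hypothetical sketch: the site never learns the user's age, only whether
# the request carries a "child-locked device" signal (assumed here to be a
# request attribute; the real mechanism would be an OS/browser attestation,
# which does not currently exist as a standard).

def signup_policy(request: dict) -> str:
    """Decide which account experience to provision."""
    if request.get("child_locked_device"):
        # Parental controls flagged this as a child's device:
        # provision a restricted account.
        return "restricted-account"
    # Default assumption from the thread: treat everyone else as an adult.
    return "standard-account"

print(signup_policy({"child_locked_device": True}))  # restricted-account
print(signup_policy({}))                             # standard-account
```

The design choice is that all age knowledge stays with the parent and the device vendor; the website only ever sees a boolean.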

seanmcdirmid 42 minutes ago | parent | next [-]

> Theoretically we don't actually need proof of age. Websites need to know when the user is attempting to create an account or log in from a child-locked device. Parents need to make sure their kids only have child-locked devices. Vendors need to make sure they don't sell unlocked devices to kids.

Given how current parental controls work, kids are not getting access if their device is under parental control (the default for open web access is off). So Facebook still won't see any child-locked devices, even before this ruling. My guess is that this ruling applies to parents who aren't making sure their kids get access only via child locked devices.

itissid 33 minutes ago | parent | prev | next [-]

Theoretically only

> Surveys by Britain’s tech regulator, Ofcom, find that among children aged 10-12, over half use Snapchat, more than 60% TikTok and more than 70% WhatsApp. All three apps have a notional minimum age of 13.

https://archive.ph/y3pQO

polyomino a day ago | parent | prev [-]

Children do not want child-locked devices, and they will find alternatives.

itissid 17 minutes ago | parent | next [-]

The issue is not just age verification but also device pinning.

I think the framework here is to have community-driven age verifiers (I recall there is an EU effort for digital wallets which, besides its bad parts, has some of these good parts) which can verify ages for people and link them to (locally, biometrically encrypted) devices for pinning. This would be privacy-preserving. The only downside is a mandate for all devices to have built-in hardware biometric encryption, like a finger/face print, so phones can't just be used with these apps installed.

The verification part is a job that could be done by all the teachers and coaches and, of course, parents. Anyone verifying identities would be cryptographically nominated/revoked by a number of more senior members of the community. A parent always gets the right to say OK for their kid, of course, but so could teachers or legal guardians.

We (legally) need a mandate for smart devices to have local, device-only biometric verification. The law should require these apps to follow device app store protocols.
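A toy sketch of that verifier-plus-device-pinning flow. All names are invented for illustration; an HMAC with the verifier's secret stands in for a real asymmetric signature (e.g. Ed25519), and a hash stands in for a hardware-backed biometric key.

```python
# Toy protocol: a human verifier binds an age claim to a device's local key,
# and the app store checks the claim without ever seeing a birth date.
import hashlib, hmac, json

VERIFIER_SECRET = b"community-verifier-secret"  # stand-in for a signing key

def issue_attestation(device_key_hash: str, is_adult: bool) -> dict:
    """Community verifier signs an age claim pinned to one device."""
    claim = {"device": device_key_hash, "adult": is_adult}
    tag = hmac.new(VERIFIER_SECRET,
                   json.dumps(claim, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def check_attestation(att: dict, presented_device_key_hash: str) -> bool:
    """App store / app verifies the claim and the device pinning."""
    expected = hmac.new(VERIFIER_SECRET,
                        json.dumps(att["claim"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, att["tag"])
            and att["claim"]["device"] == presented_device_key_hash)

device = hashlib.sha256(b"local-biometric-key").hexdigest()
att = issue_attestation(device, is_adult=False)
print(check_attestation(att, device))               # True: pinned device
print(check_attestation(att, "some-other-device"))  # False: not pinned
```

The privacy-preserving property being sketched: the attestation carries only a boolean and a device binding, never the person's identity or birth date.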

sixsevenrot 13 hours ago | parent | prev | next [-]

As with smoking, alcohol, sex, drugs etc

Children who are smart enough to get access to a given vice without getting caught are more likely to be mature enough to be able to cope with that vice.

cr125rider 6 hours ago | parent [-]

I think we’re going to see how that plays out with gambling.

It seems a bit silly to think security abstinence is the solution.

skybrian a day ago | parent | prev | next [-]

True, it's never going to be 100%, but at least it's a tractable problem for parents. Enough to change what the culture considers "normal," anyway.

IAmBroom 11 hours ago | parent | prev | next [-]

Imperfect solutions are still called "solutions".

kakacik 7 hours ago | parent | prev [-]

Well then don't give them money to do so; it's not like phones grow on trees. If you make selling a phone/internet device to a minor under a certain age threshold an illegal act, severely punished by law in the same way alcohol and cigarette sales are, many cases of access are solved. Also, a paid internet subscription doesn't grow on trees either, even though there are free wifi networks.

All imperfect solutions, but they slice original huge problem into much smaller chunks which are easier to tackle with next approach.

kelseyfrog a day ago | parent | prev [-]

[flagged]

genthree a day ago | parent | next [-]

I believe Zuckerberg has a term for people who willingly break online anonymity because someone with a domain name and website asks them to.

throwaway27727 a day ago | parent | prev [-]

Establishments don't record my data or even take down my name. They take a look at the birthdate and wave me forward.

triceratops a day ago | parent | next [-]

We need a way to do this online.

kelseyfrog a day ago | parent | prev [-]

> Establishments don't record my data or even take down my name.

What are you talking about? Have you really never rented a car before?

Some establishments, as part of their business practice, require identification.

triceratops a day ago | parent [-]

And many don't. Bars, nightclubs, liquor stores, tobacconists, R-rated movies.

kelseyfrog a day ago | parent [-]

We don't see people worried that bars, nightclubs, liquor stores, tobacconists, or R-rated movies asking for age verification will slip into requiring names too.

It honestly looks like an emotional panic. People who take slippery slopes seriously aren't to be taken seriously themselves.

Social media is like e-cigarettes in the sense that the shift toward nicotine salts (think Juul) around 2015 resulted in e-cigarettes becoming more dangerous and thus more age-restricted.

It's also like consumer credit cards. Remember that in 1958 Bank of America just mailed out 60,000 unsolicited credit cards to residents of Fresno, CA, without application, age verification, or identity check. They just landed in people's mailboxes, including those of minors. Eventually a predatory lending industry developed and we increased the age and ID requirements. My point is that systems can, and do, become more dangerous over time. Not all, but not none.

Algorithmic feeds, online advertising, and attention engineering are the nicotine salts of social media. The product's changed, so should the access.

duskdozer 17 hours ago | parent | next [-]

>We don't see people worried that bars, nightclubs, liquor stores, tobacconists, R-rated movies asking for age verification will slip into requiring names too.

Do we not? Sellers often don't just look at IDs now; they scan them into their system and, naturally, keep and sell your identity info, purchase data, and anything else they have access to.

>Algorithmic feeds, online advertising, and attention engineering are the nicotine salts of social media. The product's changed, so should the access.

This basically makes it clear. The problem is not that children are on social media. The problem is that "social media" has been allowed to become a platform for exploitation and manipulation by their owners. Adults aren't free from this either.

ndriscoll a day ago | parent | prev | next [-]

Digital age verification laws I've read also literally specifically ban recording that information, unlike in person. People were arguing with me that companies would decide they need to retain that info for audit purposes when there are no audit requirements and when it's illegal to store it for any reason.

triceratops a day ago | parent | prev [-]

> People who take seriously slippery slopes aren't to be taken seriously themselves

> Eventually a predatory lending industry developed and we increased the age and ID requirements

I have no idea if you're arguing for or against verification. You dismissed the idea that age verification is a slippery slope to more stringent ID requirements, then provided an example where the exact opposite happened.

kelseyfrog a day ago | parent [-]

I'm not arguing that social media will get worse, I'm arguing that it has gotten worse. A slippery slope argues that something will happen. I'm pointing out that it has happened. Huge difference.

Even more, my point is that rules, regulations, and requirements adapt when these changes become unbearable. That has happened with social media, therefore a change in rules, regulations, and requirements is deserved.

whatshisface a day ago | parent | prev | next [-]

I'm not comfortable with the idea that children's private messages would be exposed to thousands of social media workers and government employees.

newscracker 21 hours ago | parent | prev | next [-]

In a way, this is like saying that one trusts total strangers in some random large tech company and total strangers in government agencies to read and/or manipulate conversations that kids have. This also paves the way to disallow E2EE for other classes of people based on arbitrary criteria. I don’t believe this is good for society overall.

intended 17 hours ago | parent [-]

The reason we are having this discussion is that the private route worked only up to a point.

Firms have a fiduciary duty to shareholders and profit.

On the other hand, you ultimately decide the rules and goals under which government organizations operate, and they do not have a profit-maximization target.

They aren’t the same tool, and they work for different situations.

The E2EE slippery slope is a different challenge, and for that I have no thoughts

triceratops a day ago | parent | prev | next [-]

I have kids. I don't want creeps and predators spying on their conversations with friends.

pylua a day ago | parent | next [-]

That's true, I didn't consider that

jMyles a day ago | parent | prev [-]

https://web.archive.org/web/20210522003136/https://blog.nucy...

noosphr a day ago | parent | prev | next [-]

You just need to provide the government with your name and address and the name and address of the counter party every time you send an encrypted message.

If you don't support this you're obviously a pedo nazi terrorist.

hsbauauvhabzb a day ago | parent | prev | next [-]

The problem is all these ‘for the children’ arguments contain collateral damage.

vaylian 17 hours ago | parent | next [-]

And the effectiveness for the stated goal is also often questionable.

pylua a day ago | parent | prev | next [-]

It does seem like it could potentially be used to enforce mass surveillance over the people of the United States

simmerup a day ago | parent [-]

Alphabet can grep your emails, and Amazon has literal microphones and cameras in most people's houses.

That ship has sailed

pylua a day ago | parent [-]

Yes, Google analyzes everything you upload to it, and if it finds a violation it will report it to the proper government agencies.

It is actually terrifying. If you write something out of context or upload an image out of context, you can be in big trouble.

intended 17 hours ago | parent | prev [-]

Well, the problem is that the “don’t do it” arguments have children as the collateral damage.

We are at a point where we are picking and choosing collateral damage targets.

usr1106 17 hours ago | parent | prev [-]

There is no reason kids should use so-called smart devices, except making certain companies richer. Kids have had a healthy development without such crap for thousands of years. We don't discuss what percentage of alcohol should be allowed in beer and wine for kids.

IAmBroom 11 hours ago | parent [-]

The French (watered wine) and British (shandies) do.

lrvick 5 hours ago | parent | prev | next [-]

Centralized organizations with proprietary software can never offer meaningful end to end encryption because they can just ship an app update to disable or backdoor it at any time.

It is better for them to be forced to turn off the security theater so people that need actual privacy can research alternatives.

ronsor 21 hours ago | parent | prev | next [-]

This is the core issue.

We know that this isn't really going to reduce harm for children, we know Meta is not seriously going to suffer or change, and we know this is going to be used as a cudgel to beat down privacy and increase surveillance.

armada651 16 hours ago | parent [-]

Why is it so important that kids have access to the internet that we're willing to sacrifice both our privacy and our freedom of speech for it, when we already know it's damaging their mental health?

We don't need all this privacy invasion if we just didn't give kids a smartphone with a data plan.

themafia a day ago | parent | prev | next [-]

> Many will cheer for any case that hurts Meta

Absolutely. Particularly where they've been found to be guilty.

> but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption

Why _social media_ companies are backtracking. I'm extremely nonplussed by this outcome.

> concerns that allowing teens

Yes, because that's what we all had in mind when considering the victims and perpetrators of these crimes.

bitwize a day ago | parent | prev | next [-]

The Clipper chip is coming back.

intended 17 hours ago | parent | prev | next [-]

Rock, meet hard place?

Harm to kids is actually happening, and this is always going to be a hot button topic.

E2E is critical for our current ability to communicate online, but will be a lower priority when pitted against child safety.

Fighting the good fight is one thing, fighting for the sake of it, without a plan that addresses the tactical reality is another altogether.

Personally, I think E2E will be defended, but it’s becoming a lightning rod for attention. As if removing encryption will solve the emerging issues.

I suspect providing alternatives to champion, such as privacy preserving ways to verify age, will force a conversation on why E2E needs to go.

bdangubic 21 hours ago | parent | prev | next [-]

This is a good thing for “social” media. If you use any social media app (especially those owned by Meta) you should assume that absolutely everything you do is for full public consumption. Maybe these changes will make everyone stop thinking that anything is private when using “social” media apps.

gzread a day ago | parent | prev [-]

Is it illegal or is it just illegal on general purpose platforms whose focus isn't extreme security?

We all know Meta can still read E2EE chats (otherwise they wouldn't do it) and they're using E2EE as an excuse to avoid liability for the things their platform encourages. Contrast this with something like Signal where the entire point is to be secure.

cristoperb a day ago | parent | next [-]

> We all know Meta can still read E2EE chats

That can't be true, otherwise in what sense is it E2EE?

duskdozer 17 hours ago | parent | next [-]

Well, I've seen services describe having "E2EE" where one end is your computer and the other end is their server, so...

vaylian 17 hours ago | parent | prev | next [-]

The metadata is still unencrypted. That also reveals quite a bit.

gzread a day ago | parent | prev | next [-]

In the sense that calling it E2EE gives people a warm fuzzy feeling and makes people send more sensitive information over the platform.

Has anyone actually audited it?

babelfish a day ago | parent [-]

Probably their auditors? Lying about this would be tantamount to (very serious) securities fraud. Not sure what you're basing your allegations on besides "trust me bro".

gzread 8 hours ago | parent | next [-]

Why would lying about having E2EE be securities (as in stock market) fraud? Would that make any lie ever told by a corporation equate to stock market fraud?

babelfish 8 hours ago | parent [-]

Yes! As Matt Levine says, “everything is securities fraud”

gzread an hour ago | parent [-]

So if Microsoft tells me upgrading to Windows 11 will make my computer better, you think that's securities fraud?

Supermancho a day ago | parent | prev [-]

[dead]

interestpiqued a day ago | parent | prev [-]

I mean you can read it in your app and they're not just stored on your phone. E2E just means in transport from what I understand.

SAI_Peregrinus a day ago | parent [-]

E2EE means end-to-end, where the ends are the participants in the chat. They can read it on your phone, but not on their servers. They need their app to separately transmit the plaintext to their servers to read it.
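A toy illustration of that distinction, using a one-time-pad XOR as a stand-in for real E2EE (actual protocols like Signal's are far more involved): the relay server stores only ciphertext and never holds the key.

```python
# The "ends" share a key; the server in the middle only relays ciphertext.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with the corresponding key byte (toy cipher).
    return bytes(a ^ b for a, b in zip(data, key))

shared_key = secrets.token_bytes(32)     # known only to the two participants
plaintext = b"meet at noon"

ciphertext = xor(plaintext, shared_key)  # sender's end encrypts
server_copy = ciphertext                 # the server only stores/forwards this
received = xor(server_copy, shared_key)  # receiver's end decrypts

print(received == plaintext)             # True
print(server_copy != plaintext)          # True: server sees only ciphertext
```

This is also why the "app transmits the plaintext separately" caveat matters: the math keeps the server out, but a proprietary client controls both ends and could leak the plaintext around the encryption.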

throwaway173738 a day ago | parent [-]

Which is technically possible.

markdown a day ago | parent | prev [-]

The first two E's in E2EE stand for end. From one end to the other. So no, Meta can't. Or put another way... if they can read those messages, then it's not E2EE.