skybrian 13 hours ago

Devices with child locks turned on really shouldn't have access to everything on the Internet. A simple protocol could let cooperating websites know when child locks are on, so they don't show inappropriate content. Whitelisting or blacklisting could handle the rest.

This doesn't mean every device needs to implement child locks. It also shouldn't affect anyone using unlocked devices at all.
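To make the idea concrete, here's a minimal sketch of what the "cooperating website" side of such a protocol might look like, assuming a hypothetical request header (here called "X-Child-Lock") that a locked device would attach to every request. The header name, the restricted paths, and the Node/TypeScript server are all illustrative assumptions, not any existing standard:

    // Minimal sketch: a site that self-restricts when a (hypothetical)
    // child-lock header is present on the request.
    import { createServer } from "node:http";

    // Hypothetical per-site policy: paths this site withholds from locked devices.
    const RESTRICTED_PATHS = new Set(["/mature", "/marketplace/tobacco"]);

    const server = createServer((req, res) => {
      // Assumed signal: locked devices send "X-Child-Lock: 1"; unlocked devices send nothing.
      const childLockOn = req.headers["x-child-lock"] === "1";
      const path = new URL(req.url ?? "/", "http://localhost").pathname;

      if (childLockOn && RESTRICTED_PATHS.has(path)) {
        // The cooperating site simply refuses to serve this content to locked devices.
        res.writeHead(403, { "Content-Type": "text/plain" });
        res.end("Not available on devices with a child lock enabled.\n");
        return;
      }

      res.writeHead(200, { "Content-Type": "text/plain" });
      res.end("OK\n");
    });

    server.listen(8080);

In this sketch, devices without a child lock never send the header, so nothing changes for them; only locked devices and cooperating sites participate.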

AJ007 13 hours ago | parent | next [-]

How does that even begin to make sense?

I want to protect my child from X type of content -- one of the many jobs of a parent -- but I'm supposed to trust all content to self-report as inappropriate for children? "Inappropriate" is entirely subjective and cannot be defined as some sort of universal bool -- and that's before you get to actively malicious actors like Meta and TikTok exploiting children for their content farms and ad impression factories.

If the user owns and controls their computers -- as they should -- then that subjective content filtering layer belongs there, in the owner's control. If it's a child's, then the parent owns the device, not the child.

skybrian 12 hours ago | parent [-]

The idea is that society should have some common standards for what's inappropriate for children. For example, parents don't want their kids to buy cigarettes, but also, stores don't want to sell them cigarettes. When there's consensus on this, cooperation is possible. Parents have an easier time when they get cooperation from the rest of society.

But there isn't going to be consensus on everything, so content filters are still needed.

popcornricecake 13 minutes ago | parent | next [-]

It's the internet. There are no borders and there is no mandate to follow any consensus. Stores may not want to sell cigarettes to children, but e-stores safely hosted in some remote country do want to sell them nicotine pouches and vapes. With a protocol that makes age information always available to websites, they could hide their intentions from adults while actively targeting children.

iamnothere 8 hours ago | parent | prev [-]

So simple: just get various Christian, Muslim, atheist, traditionalist and progressive, sane and insane parents to all agree on a common set of what is appropriate and inappropriate. And then enforce that on all of their children. Why didn't I think of that? That should go great.

gzread 12 hours ago | parent | prev [-]

> a simple protocol could let cooperating websites know when child locks are on, so they don't show inappropriate content.

Isn't that literally the California law?

skybrian 12 hours ago | parent [-]

Not sure which law you mean, but I think there's a distinction between "disallows children under 16 from creating an account" (which apparently requires age verification) and "disallows creating or logging into a social media account from a device with a child lock on" (which doesn't).

gzread 12 hours ago | parent [-]

The California law is the one where parents select how old their child is and apps must respect that as a child lock.