duped 4 hours ago

This doesn't address the wider array of age-verification related problems that people want to solve, like social media where age verification is needed to police interactions between users.

jdasdf 4 hours ago | parent | next [-]

Such censorship shouldn't exist in the first place.

Bender 4 hours ago | parent | prev | next [-]

I could be misunderstanding the context but to me that sounds like a moderation issue assuming we even want small children on social media in the first place. There should probably be a dedicated child-safe social media site that limits what communication can take place for small children and has severe punishments for adults pretending to be children for the purposes of grooming.

duped 3 hours ago | parent [-]

Moderation is like law enforcement: it doesn't prevent crimes from happening, it just punishes the people it can catch. Severe punishments already exist for the kinds of behavior I'm talking about, but unsurprisingly, this does not stop kids from being harmed and it doesn't undo the harm.

This isn't hypothetical, by the way. There are adults catfishing kids into producing CSAM [0], kidnapping and assaulting minors [1], [2], and in the most extreme case, there's a borderline cult of young adults who terrorize people for fun [3].

It is a constant game of whack-a-mole for moderators/admins to keep this behavior out of online spaces where kids hang out.

I recognize that this is a "think of the children" argument, but indeed that's the point. The anonymous web was created without thinking about the children, just like how all social media was created without thinking about how it could be used to harm people. Age verification is the smallest step towards mitigating that harm.

Now I disagree very strongly with the laws proposed (and indeed, I've been writing/calling/talking with state reps about this locally, because I don't want my state's bill passed). But the technical challenge needs to address the real problems that legislators are trying to go after.

[0] https://www.justice.gov/usao-wdnc/pr/discord-user-who-catfis...

[1] https://www.nbcnews.com/news/us-news/kidnapping-roblox-rcna2...

[2] https://www.nbcmiami.com/news/local/nebraska-man-charged-wit...

[3] https://www.fbi.gov/contact-us/field-offices/boston/news/ope...

Bender 2 hours ago | parent [-]

I am only interested in protecting the majority of children, which I believe my proposal more than covers. There will always be exceptions. Today, teens share porn, warez, and pirated movies and music with small children in rated-G video games. I am not proposing anything for that. It is up to businesses to detect and block such things.

Point being, there will be a myriad of exceptions. I am not looking to address the exceptions. Those can be a game of whack-a-mole as they are today. I am proposing something that would prevent the vast majority of children from being exposed to the trash we today call social media and of course also porn sites.

trinsic2 2 hours ago | parent | next [-]

Look, please don't sideline/marginalize people by using the "whataboutism" term. That's being used more and more to silence dialogue from people who see problems outside the focus of a specific area. It's important that we see ALL sides of the problem.

Bender an hour ago | parent [-]

Fair enough. Even though I do not perceive it that way, I removed it in case a majority of others have come to that conclusion.

trinsic2 an hour ago | parent [-]

Thank you for understanding. I know topics can sometimes get out of hand with comments about related things, but in this case we might be better off looking at all the extremities.

duped an hour ago | parent | prev [-]

These aren't exceptions or whataboutism. It's the debate being had on the floors of state legislatures.

> It is up to businesses to detect and block such things.

Which is exactly why age verification legislation is hitting the books. No one (serious) cares about whether kids can download porn and R rated movies. Parental controls already exist if the threat model is preventing access to specific content that is able to report itself as _being_ that content.

Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content. They define specifically what classifies as an addictive stream and put the onus on service providers to assert that they're not delivering addictive streams of media to kids. An HTTP header isn't enough, because it's not about the content being shown to kids but the design patterns of how it's accessed.

Essentially: age verification isn't about porn. 18+ content stirs the pot a bit with the evangelical crowd but it's really not what people are worried about when it comes to controlling digital media access with age gates.

Bender an hour ago | parent [-]

> Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content.

That sounds simple to me. If a type of content is addictive, then require the RTA header for:

- Adult content, or possible adult content.

- User contributed or generated content (this covers most of social media)

- Sites whose psychological design patterns are deemed addictive (TikTok and their ilk)

Overall we are describing things that are harmful to the development of the minds of small children. If adults wish to avoid such content they can create a child account on their device for themselves to be excluded from this behavior as well. I use a child account in a couple of popular video games to avoid most of the trash talking and spam. I'm not hiding my age as the games have my debit card information but rather I opt-in to parental controls.
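For what it's worth, the labeling scheme being proposed here already exists in a basic form: the RTA ("Restricted To Adults") label, which a site can expose as an HTTP "Rating" header or an HTML meta tag, and which filtering software checks for. A minimal sketch of the client side, assuming a filter that inspects both places (the function name and structure are illustrative, not any particular product's API):

```python
# Hypothetical sketch of how a parental-control filter might detect the
# RTA (Restricted To Adults) label. Sites can advertise it either as an
# HTTP "Rating" response header or as a <meta> tag in the page HTML.
# The label string below is the published RTA label value.

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def is_rta_labeled(headers: dict, html: str = "") -> bool:
    """Return True if a response advertises the RTA label."""
    # Header names are case-insensitive, so normalize before comparing.
    for name, value in headers.items():
        if name.lower() == "rating" and RTA_LABEL in value:
            return True
    # Fallback: look for the label anywhere in the page markup,
    # e.g. <meta name="rating" content="RTA-5042-1996-1400-1577-RTA">.
    return RTA_LABEL in html

# A filtering proxy or browser extension on a child account could then
# block any response for which is_rta_labeled(...) returns True.
print(is_rta_labeled({"Rating": RTA_LABEL}))                      # True
print(is_rta_labeled({}, '<meta name="rating" content="RTA-5042-1996-1400-1577-RTA">'))  # True
print(is_rta_labeled({"Content-Type": "text/html"}))              # False
```

The weakness, as noted upthread, is that this is purely voluntary self-labeling: it works only if sites emit the label and only for content-based gating, not for the "addictive design pattern" criteria the legislation targets.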

svachalek 4 hours ago | parent | prev [-]

This is assuming children should be on social media at all, which I for one would debate.