duped | 3 hours ago
These aren't exceptions or whataboutism; this is the debate being had on the floors of state legislatures.

> It is up to businesses to detect and block such things.

Which is exactly why age verification legislation is hitting the books. No one (serious) cares about whether kids can download porn and R-rated movies. Parental controls already exist if the threat model is preventing access to specific content that is able to report itself as _being_ that content.

Your proposal also doesn't address the other domain these legislators are targeting: addictive content. They define specifically what qualifies as an addictive stream and put the onus on service providers to assert that they're not delivering addictive streams of media to kids. An HTTP header isn't enough, because it's not about the content being shown to kids but the design patterns of how it's accessed.

Essentially: age verification isn't about porn. 18+ content stirs the pot a bit with the evangelical crowd, but it's really not what people are worried about when it comes to controlling digital media access with age gates.
Bender | 2 hours ago | parent
> Your proposal also doesn't address the other domain that these legislators are targeting, which is addictive content.

That sounds simple to me. If a type of content is addictive, then require the RTA header for:

- Adult content, or possible adult content.
- User-contributed or user-generated content (this covers most of social media).
- Sites whose psychological design is deemed addictive (TikTok and their ilk).

Overall we are describing things that are harmful to the development of small children's minds. If adults wish to avoid such content, they can create a child account on their device for themselves to be excluded from this behavior as well. I use a child account in a couple of popular video games to avoid most of the trash talking and spam. I'm not hiding my age (the games have my debit card information); rather, I opt in to parental controls.
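For reference, the RTA label mentioned above is a fixed string that a site can serve both as an HTTP `Rating` header and as a `<meta name="rating">` tag, so filtering software can block the response without parsing the page. A minimal sketch using Python's stdlib `http.server` (the handler and page body here are illustrative, not any real site's code):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The standard RTA (Restricted To Adults) label string.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class RTAHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Label the page itself as well, for filters that only inspect HTML.
        body = (
            f'<html><head><meta name="rating" content="{RTA_LABEL}">'
            f"</head><body>restricted content</body></html>"
        ).encode()
        self.send_response(200)
        # HTTP-level label: parental-control software can act on the
        # header alone, before any content reaches the client.
        self.send_header("Rating", RTA_LABEL)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), RTAHandler).serve_forever()
```

The point of the label being a constant string is that filters need no classifier: match the header or meta tag, block the response.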