arn3n 5 hours ago
Parents are competing with multi-trillion-dollar companies that have invested untold amounts of cash and resources into making their content addictive. When parents try to help their children, it's an uphill battle -- every platform with kids on it also tends to have porn, or violence, or other such content, because these platforms generally have disappointingly ineffective moderation. Most parents turn to age verification because it's the only lever they can think of against the likes of Meta or ByteDance, but the real issue is that these platforms shouldn't carry this content in the first place.

Platforms should be smaller -- the same site shouldn't be serving pornography, my school district's announcement page, and my friend's travel pictures. Large platforms are turning their unwillingness to moderate into legal and privacy problems, when it should simply be a matter of "these platforms have adult content, and these ones don't." Then parents could much more easily ban specific platforms and topics. Right now there are no levers to pull or adjust, and parents have their hands tied. You can't take kids off Instagram or TikTok -- they'd lose their friends. I hate that the "keep up with my extended family" platform is the same one as the "brainrot and addiction" platform. Platforms need to be small enough that parents actually have choices about what to let in and what to keep out.

Until platforms are broken up via antitrust, or until the burden of moderation falls on the company, we're going to keep getting privacy-infringing solutions. If you support privacy, you should support antitrust; otherwise we'll see these same bills again and again and again until parents can effectively protect their children.