thunfischtoast 14 hours ago

> These days, exposing an immature brain to the raw internet is basically just handing the brain and personality over to be molded by large corporations and algorithms.

You make the case that today's internet is unsuitable for young children. But has this ever been different, maybe apart from the very first days of the internet? While access through phones has reshaped the internet fundamentally, I'd propose that it has always been dangerous. When I was 12, a single wrong click could destroy your machine, lead to a physical bill being sent to my parents' home (which happened), or lead to deeply disturbing pictures and videos.

So I think it's not the case that we should allow kids completely unsupervised access (we never should have), but it's also naive to think that we can regulate our way out of this (at the state or household level; we never could).

theshrike79 13 hours ago | parent | next [-]

When my generation "accessed the internet", there was a massive dial-up sound and the single family PC was in the living room, visible to everyone.

Even later when the computer was in my room, I still had to go look for the creepy shit, it didn't appear in my email inbox.

Kids this age browse the internet through algorithmic apps built to maximise engagement, in a corner of their bed in their room. Parental controls for most apps and operating systems are a fucking joke.

hellojesus 8 hours ago | parent [-]

Agreed, but isn't this a parental issue? Why aren't parents moving back to a "shared pc in the living room" model?

I absolutely would not allow a kid to have an unregulated smartphone, and then further compound the problem at home by letting them use it privately and without interruption. Device-management enrollment is trivial on iPhones.

theshrike79 5 hours ago | parent [-]

Having a smart phone is required for taking part in society.

Monitoring said devices is a lot harder; enrolling them in device management doesn't let me monitor the content of specific apps.

labcomputer 6 hours ago | parent | prev | next [-]

This feels like an extremely naive take.

Even as late as the mid-aughts, the internet was mostly nerdy technical information and real people sincerely discussing various topics, and the very worst thing was a little bit of (mostly still-image) porn, if you were looking for it.

Kids back then weren't targeted by a stream of continuously A/B tested algorithmic content intended to tell them what to think and shape their brains. Overwhelming evidence exists that social media (as it exists today) is bad for the mental health of young people (and probably adults, too, but at least adults have the presence of mind and lack of social pressure to delete Facebook).

dagss 13 hours ago | parent | prev [-]

I think there is a drastic difference between a one-off exposure to bad images and an algorithm subtly choosing, over time, whether to expose the Pokemon-interested child to racist Pokemon videos vs non-racist Pokemon videos on TikTok. (Or anorexic Pokemon videos, or...)

The amount of time spent and the repeated exposure are key.

The question is really what kind of human is raised, rather than raw exposure as such.

So for that reason things are different, IMO, than 20 years ago.

Yes, of course some people fell into internet-forum rabbit holes 20 years ago, and pen-pal-letter-induced rabbit holes 100 years ago. But it did help that it was something like 5% of the population spending their time there instead of 95%.

Regarding your last point, I don't necessarily disagree (again, I didn't check up on this law; I care more about the laws in my own country), but I think arguing against the law will go better if one does not display naivety when making the arguments.

Don't say "it will be better if all kids are exposed to everything early" (it won't); instead say "the medicine will not work, and anyway the side effects are worse than the sickness it intends to cure" (if that is the case).

LinXitoW 12 hours ago | parent [-]

But the algorithm stuff is bad for everyone, and makes a lot of money, so it's obviously never ever going to be part of any regulation.

labcomputer 6 hours ago | parent | next [-]

But adults (who have fully developed brains, unlike adolescents) can choose not to engage with the algorithm stuff.

dagss 12 hours ago | parent | prev [-]

Australia banned social media for under-16s, and many other countries are looking at variations on this.

In the US, perhaps not...