| ▲ | dagss 14 hours ago |
| Not commenting on this specific law, but I do believe the premise that children should be exposed to everything is wrong, and that the overall view of humans in this post is naive. These days, exposing an immature brain to the raw internet is basically just handing the brain and personality over to be molded by large corporations and algorithms. And humans have never been rational, self-contained actors that self-educate perfectly when exposed to information, converging on an objectively good and constructive worldview. Quite the opposite. Humans develop in relation to one another, increasingly in relation to algorithms, and sometimes become messed up, and sometimes those mess-ups would have been avoidable had relations or exposure been different. In fact I would say you as a parent are not doing your job if you are not trying to make sure a 12 year old isn't pulled into, say, an anorexia rabbit hole. Whether that is best done by making sure exposure doesn't happen, or through exposure and education, will depend on the child and parent (and society) in question. What worked best for a highly rational, self-reliant geek teen may simply be a disaster for another human. And what worked for an upper-class, highly educated family may not work for a poor family with alcoholic parents, or parents working 18 hours a day to make ends meet. And parents are not perfect -- if all parents were perfect, there would also be no alcoholics and drug addicts or poverty or war. But people are imperfect, and it's natural to make laws to mitigate at least the worst effects of that. (Again, haven't read this specific law proposal, but found the worldview of OP a bit naive.) |
|
| ▲ | thunfischtoast 14 hours ago | parent | next [-] |
| > These days, exposing an immature brain to the raw internet is basically just handing the brain and personality over to be molded by large corporations and algorithms. You make the case that today's internet is unsuitable for young children.
But has this ever been different, except perhaps in the very first days of the internet?
While access through phones has reshaped the internet fundamentally, I'd propose that it has always been dangerous. When I was 12, a single wrong click could destroy your machine, or lead to a physical bill being sent to my parents' home (which has happened), or lead to the most disturbing pictures and videos. So I think it's not the case that we should allow kids completely unsupervised access (it never was), but it's also naive to think that we can regulate our way out of this, at the state or household level (we never could). |
| |
| ▲ | theshrike79 13 hours ago | parent | next [-] | | When my generation "accessed the internet", there was a massive dial-up sound and the single family PC was in the living room, visible to everyone. Even later, when the computer was in my room, I still had to go look for the creepy shit; it didn't appear in my email inbox. Kids this age browse the internet through algorithmic apps built to maximise engagement, in a corner on the bed in their room. Parental controls for most apps and operating systems are a fucking joke. | | |
| ▲ | hellojesus 8 hours ago | parent [-] | | Agreed, but isn't this a parental issue? Why aren't parents moving back to a "shared pc in the living room" model? I absolutely would not allow a kid to have an unregulated smartphone and then further compound the problem at home by allowing them to access it privately and without interruption. Device management enrollment is trivial on iphones. | | |
| ▲ | theshrike79 5 hours ago | parent [-] | | Having a smartphone is required for taking part in society. Monitoring said devices is a lot harder; enrolling in device management doesn't let me monitor the content of specific apps. |
|
| |
| ▲ | labcomputer 6 hours ago | parent | prev | next [-] | | This feels like an extremely naive take. Even as late as the mid-aughts the internet was mostly nerdy technical information, real people sincerely discussing various topics, and the very worst thing was a little bit of (mostly still-image) porn if you were looking for it. Kids back then weren't targeted by a stream of continuously A/B tested algorithmic content intended to tell them what to think and shape their brains. Overwhelming evidence exists that social media (as it exists today) is bad for the mental health of young people (and probably adults, too, but at least adults have the presence of mind and lack of social pressure to delete Facebook). | |
| ▲ | dagss 13 hours ago | parent | prev [-] | | I think there is a drastic difference between a one-off exposure to bad images, and an algorithm choosing, subtly and over time, whether to expose the Pokemon-interested child to racist Pokemon videos vs non-racist Pokemon videos on TikTok. (Or anorexic Pokemon videos, or...) Amount of time spent and repeated exposure are the key. The question is really what kind of human is raised, rather than raw exposure as such. So for that reason things are different IMO than 20 years ago. Yes, of course some people would fall into internet forum rabbit holes 20 years ago, and pen-pal-letter-induced rabbit holes 100 years ago. But it did help that it was like 5% of the population instead of 95% of the population spending their time there. Regarding your last point, I don't necessarily disagree (again, I didn't check up on this law; I care more about the laws in my own country), but I think arguing against the law will go better if one does not display naivety when making the arguments. Don't say "it will be better if all kids are exposed to everything early" (it won't); instead say "the medicine will not work and anyway the side-effects are worse than the sickness it intends to cure" (if that is the case). | | |
| ▲ | LinXitoW 12 hours ago | parent [-] | | But the algorithm stuff is bad for everyone, and makes a lot of money, so it's obviously never ever going to be part of any regulation. | | |
| ▲ | labcomputer 6 hours ago | parent | next [-] | | But adults (who have fully developed brains, unlike adolescents) can choose not to engage with the algorithm stuff. | |
| ▲ | dagss 12 hours ago | parent | prev [-] | | Australia banned social media under 16, and many other countries are looking on variations on this. In the US, perhaps not... |
|
|
|
|
| ▲ | eloisant 12 hours ago | parent | prev | next [-] |
| I agree, and I believe too many geeks who are now parents (including the author of the blog post) do not realize that the computers they grew up with, and in particular the Internet they grew up with, are nothing like the computers (phones) and the Internet kids have access to today. |
| |
| ▲ | intrasight 6 hours ago | parent [-] | | The Grimm fairy tales (1819) are full of graphic violence, child abuse, anti-semitism, and incest. They are much more harmful than anything that I've encountered on the Internet. So why are we discussing internet harms instead of book harms? Because people are fucking stupid. And why are we getting concerned about "sharing private information with random web sites" when that's not the solution being discussed? The solution is a simple handshake: service: Is the person assigned this device old enough to use this service? idp proxy: yes|no |
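(The handshake above can be sketched in a few lines. This is a hypothetical illustration only: there is no standard API for this, and the function name, the device registry, and its data model are all assumptions. The point is that the identity provider proxy answers a bare yes/no age-threshold question, and the service never learns who the user is.)

```python
def idp_proxy_is_old_enough(device_id: str, min_age: int,
                            device_registry: dict, current_year: int) -> bool:
    """Answer the service's question with a bare yes/no.

    device_registry maps a device id to the birth year the account
    holder registered with the OS vendor (an assumed data model).
    The service side only ever sees the boolean, never the identity.
    """
    birth_year = device_registry.get(device_id)
    if birth_year is None:
        return False  # unknown device: fail closed
    return (current_year - birth_year) >= min_age


# Example: one adult-registered device, one child-registered device.
registry = {"device-42": 1990, "device-7": 2015}
print(idp_proxy_is_old_enough("device-42", 18, registry, 2025))  # True
print(idp_proxy_is_old_enough("device-7", 18, registry, 2025))   # False
```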
|
|
| ▲ | nicman23 12 hours ago | parent | prev | next [-] |
| > believe the premise that children should be exposed to everything is wrong imo this is what is wrong with modern parenting. Reality does not care about the child's feelings, and if a child is old enough to have a screen with internet unattended, it is old enough for anything. |
| |
| ▲ | gzread 12 hours ago | parent [-] | | I don't understand what you're trying to say in this comment. Can you reword it? | | |
|
|
| ▲ | a456463 9 hours ago | parent | prev | next [-] |
| If you are not perfect, then don't have kids - not if you can't take care of them and nurture them with the attention that they need and rightfully deserve. |
|
| ▲ | FetusP 13 hours ago | parent | prev | next [-] |
| But should it be the government's responsibility to decide/control how children are raised, and what they are exposed to? Maybe in the future, a governing body will try to age-lock dissenting opinions with some crafty verbiage. |
| |
| ▲ | gzread 12 hours ago | parent [-] | | The CA/CO law only requires the option to enable parental controls on an account, and as the article points out, can be worked around by a sufficiently determined child using something like a virtual machine. This is not really the government deciding how children should be raised. The parent still has the ability to choose to apply the parental controls. It's more like the rule that minors can't buy alcohol in bars - parents can still buy alcohol at the supermarket for their children, and sufficiently determined children can find some other adult to buy it for them. Probably by the time you know how to install a virtual machine, you can handle the unrestricted internet. | | |
| ▲ | Cyph0n 10 hours ago | parent [-] | | The bigger problem is it sets us on a possible path towards completely government-controlled computing devices. The fact that so many countries are pursuing ID requirements online is somewhat of a canary for this whole OS age check thing imo. |
|
|
|
| ▲ | jajuuka 8 hours ago | parent | prev | next [-] |
| I've seen this view applied to things like TikTok and Instagram, especially with the recent lawsuit. But then when it comes to addressing it, most people seem to flip completely and bemoan parenting and internet freedom. It just ends up in a circular pattern of "this is awful, but we shouldn't do anything about it. These companies are poisoning kids, but any attempts to rectify that are infringing on my right to the internet." Makes a lot of conversations around this topic feel entirely pointless. |
|
| ▲ | muyuu 12 hours ago | parent | prev | next [-] |
| ok, that is the argument with merit in favour of shielding kids from the internet - now let's consider what it looks like when the locus of responsibility is governments. it's true that kids are vulnerable to certain forms of content on the internet. it's also true that adults are vulnerable to certain forms of content on the internet. it's also true that governments cannot police "harmful content" on the internet effectively, or even meaningfully, if most people can easily surf the internet pseudonymously. it's also very true right now that what's on "social media" is very Sybil-vulnerable, and inordinately so with the advent of LLMs. what do you think the playbook will look like once there is some sort of tight OS-level system, enforced across the board, to certify or verify information about the user? do you think this level of coordination to push for identifying the user at all levels, happening across the world in a matter of weeks, is genuine concern for the kids alone? |
|
| ▲ | jonathanstrange 12 hours ago | parent | prev | next [-] |
| My view is that this must be left entirely to the parents. The only time a government should be allowed to interfere is when there are child abuse or neglect cases against the parents and the children are put under child protective care. It is in my view crazy and irresponsible to allow the government to override the parents' decisions about what media their children can consume. It is guaranteed that this power will be abused. |
| |
| ▲ | gzread 12 hours ago | parent [-] | | The CA/CO law is literally the government writing a law that says it shall be left to the parents but the device must give the parents the options they need. | | |
| ▲ | muyuu 11 hours ago | parent | next [-] | | it says that, but the act of hardening devices effectively contradicts what it says. to be charitable, let's say that it "enhances" parental controls by taking on some of that parental enforcement at the state level. | |
| ▲ | gzread 11 hours ago | parent [-] | | What action do you mean? | | |
| ▲ | muyuu 11 hours ago | parent [-] | | the action of forcing any sort of verification or certification on devices or operating systems this is taking the parental control largely into their own hands | | |
| ▲ | gzread 10 hours ago | parent [-] | | This law doesn't force any sort of verification or certification, so it's fine then? | | |
| ▲ | hellojesus 8 hours ago | parent | next [-] | | Does it effectively outlaw general computing for minors by requiring account holders to set up accounts for minors, where account holders are defined as being 18+? I'm honestly not sure, but I could see that being the result of the law, with companies like Best Buy disallowing minors from purchasing hardware with cash for fear of liability. | | | |
| ▲ | muyuu 9 hours ago | parent | prev [-] | | it is obviously enforcement by proxy, trying to pretend otherwise is laughable but then again so is most of the shilling supporting this legislation | | |
| ▲ | gzread 7 hours ago | parent [-] | | What is "enforcement by proxy" and how does it apply to this law? | | |
| ▲ | muyuu 5 hours ago | parent [-] | | it is extremely simple. for instance, the government can effectively ban you from saying something they don't want you to say by forcing all companies that may provide any substantial platform to you to implement their speech code. that way they have enforced a ban on you by proxy. the same way, they can verify/certify the id of people, totally or partially, when they go online, by forcing all vendors who provide the systems that you may use to go online to enforce it for them. and this law absolutely does that. |
|
|
|
|
|
| |
| ▲ | jonathanstrange 10 hours ago | parent | prev | next [-] | | So these laws state that device makers need to ensure that there is at least one operating system with parental controls that the parents can install? That would be fine for me but AFAIK that's not what these laws state. | |
| ▲ | tstrimple 8 hours ago | parent | prev [-] | | I've obviously read about how bad adult literacy in the US is, but I didn't realize how many "technologists" were impacted by it. The law is short and clear and doesn't involve attestation or age verification. Yet all these "hackers" claim it does just that. The reading comprehension and critical thinking skills seem to match the national average. | | |
| ▲ | hellojesus 8 hours ago | parent [-] | | I think most people here are extrapolating the intent behind this law, the triviality with which it can be bypassed by minor account holders, and what that means for the future. Once this law is in effect, it will be ineffectual. Minors that currently don't know what VMs are, what live booting is, what keyloggers are, etc. will learn immediately once blog posts start circulating about bypass mechanisms. Parents will then go back to the legislature and say the law as written sucks, and they will demand better laws, but the only way to get better is to force all devices to authenticate with the ISP using a gov-issued id/token to prove the account holder is not a minor. And the only way to prevent even further workarounds, like the OS lying, is to force hardware-based remote attestation. And that means the death of general computing and the death of any anonymity. | | |
| ▲ | gzread 7 hours ago | parent [-] | | Most laws are ineffectual. Kids can't drink alcohol but they still can; theft is illegal but I still got your car keys; murder is illegal but people still die. In this one, there's no punishment for bypass, just like there's no punishment for a kid who gets alcohol. Unlike the alcohol law this one doesn't even mandate the use of the child protection features - just their existence. You know the simple fix to your problem is to mark VMs as adult only apps, anyway. | | |
| ▲ | hellojesus 5 hours ago | parent [-] | | But what happens when a nefarious actor fills the void and publishes a rootkitted VM marked as safe for children? These restrictions breed black markets that usually cause even more harm. |
|
|
|
|
|
|
| ▲ | prmoustache 6 hours ago | parent | prev [-] |
| [dead] |