| ▲ | Aurornis 5 hours ago |
| > The only way to outlaw Meta’s dangerous and egregious behavior is to pass legislation, like the Kids Online Safety Act

Just last week there was uproar because Discord was going to require age verification to join adult-themed servers and bypass content filters. This is how people are getting baited into inviting these restrictions and regulations into their services: by believing it’s necessary to hurt their enemies, like Mark Zuckerberg, combined with “think of the children”. It’s still sad to see these calls for extensive regulation and oversight getting upvoted so much on Hacker News. Every time you see someone calling for regulation for kids online, remember that the only way to tell kids and adults apart is to force everyone to go through age verification. Before you start thinking that you don’t care because you don’t use social media, remember that you are reading this on a social media site. The laws aren’t going to care about whether or not you think Hacker News qualifies as social media. |
|
| ▲ | raincole 5 hours ago | parent | next [-] |
| I wonder if one day, when people hear "censored internet", the first country that comes to mind will be a Western one (probably the UK, but the US is not off the table either) instead of China. |
| |
| ▲ | Aurornis 3 hours ago | parent | next [-] | | Why would it have to be a single country or hemisphere? If it's happening globally we'll stop thinking of it as a regional thing and start thinking of it as a global problem. It is weird to see all of these HN comments demanding such regulations alongside the continued belief that they won't impact us, only the sites we don't like. Even after the Discord fallout from last week. | |
| ▲ | nephihaha 5 hours ago | parent | prev [-] | | It already does for me because I live here. Keir Starmer is desperately unpopular, and yet he wants to suppress the reasons why people hate him. https://www.telegraph.co.uk/politics/2025/09/27/starmer-leas... | | |
| ▲ | iamacyborg 4 hours ago | parent | next [-] | | Linking to the paper colloquially known as the Torygraph to make your point is rather amusing. Seems like we just look at all politicians rather unfavourably right now: https://yougov.co.uk/politics/articles/53907-political-favou... | | |
| ▲ | nephihaha 3 hours ago | parent [-] | | Keir Starmer is deeply unpopular and I have not encountered anyone who likes him (including a lifelong Labour member who has stood for them). Last time I said that about Starmer someone complained there was no link. Now people are complaining about the link. For the record, I have never voted for the Tories ever. A plague on both their houses... There isn't even a cigarette paper between Labour and Tory policies these days — oppress the poor and needy, censor, mismanage everything and enact NGO advice. |
| |
| ▲ | u02sgb 5 hours ago | parent | prev [-] | | The existence of that article surely suggests there is no censorship of the information about his unpopularity? | | |
| ▲ | 3 hours ago | parent | next [-] | | [deleted] | |
| ▲ | nephihaha 3 hours ago | parent | prev [-] | | Please see elsewhere on HN. Lots of threads about the increasing crackdown on free expression online in the UK, and forcing through digital ID online (alongside the EU, Canada, Australia etc). |
|
|
|
|
| ▲ | xg15 4 hours ago | parent | prev | next [-] |
| So then, what should be done instead? |
|
| ▲ | dfxm12 5 hours ago | parent | prev | next [-] |
| The uproar was specifically about the implemented ID checks. KOSA hasn't been passed in any form & its most recent forms introduced to the House & Senate don't include ID checks. To imply that KOSA includes some kind of ID check or that the only way to provide any type of protections is via an ID check is ignorant. |
| |
| ▲ | fc417fc802 4 hours ago | parent | next [-] | | Ignorant? Hardly. It's ignorant to assume anything but the worst from proposed regulation until proven otherwise. Particularly if past proposals from the same people included ID checks. It falls to the people proposing regulation to clearly demonstrate to everyone else that they aren't up to no good. (Spoiler, they usually are up to no good.) | | |
| ▲ | dfxm12 3 hours ago | parent [-] | | It's awfully convenient that you require others to prove or clearly demonstrate things, while you allow yourself to merely assume things :) I know you've already made up your mind, but just humor me. What can the government do to clearly demonstrate to everyone else that they aren't up to no good? | | |
| ▲ | m4nu3l 27 minutes ago | parent [-] | | It's not just about intentions. To convince me that the Government can improve things through regulations, you'd need to do a few things:
1) You must convince me that optimising for some utility function you defined is the right thing to do.
2) You must convince me that the Government can effectively estimate the utility function.
3) Finally, you must convince me that the Government can predict how the utility function will change after the policies are implemented.
For 1), I'd have problems with any utility function you could come up with. If you want to maximise total utility, for instance, does it mean that I get to assault someone as long as I gain more utility than the other person loses? What about the "Utility Monster" thought experiment?
For 2) and 3), I'm pretty sure the Government has no idea how to measure and/or predict the result. Does the scrolling addiction of a teenager cause more loss in utility than the loss of friends to a teenager with disabilities? https://www.theguardian.com/australia-news/2026/feb/06/ive-l...
Because of these basic philosophical principles, the burden of proof that some regulation is required is always on the Government's side, and the standard of proof should be much higher than it is today. I don't believe that the concept of utility is entirely useless, though.
I believe that by respecting people's individual freedoms and allowing for voluntary arrangements, you'll also get more utility in the long term, whereas if you try to force your utility optimisations you might, maybe, get utility increases in the short term, but much worse utility in the longer term. |
|
| |
| ▲ | Aurornis 3 hours ago | parent | prev | next [-] | | > The uproar was specifically about the implemented ID checks. I disagree. The uproar was clearly that ID checks were going to be required at all. All of the "Discord alternative" articles were about platforms that didn't require ID checks. > To imply that KOSA includes some kind of ID check or that the only way to provide any type of protections is via an ID check is ignorant. KOSA has specific language about minors and children under 13. How do you think platforms are expected to comply with these requirements without identifying their userbase? This goes right back to the Discord situation last week. | | |
| ▲ | dfxm12 3 hours ago | parent [-] | | KOSA has regulations regarding "users that the covered platform knows is a minor". Nothing in KOSA suggests that a platform has to proactively maintain each user's age or that ID checks have to be used. If you're still curious, Meta has a page talking about how they might determine a user's age, specifically without ID: https://about.fb.com/news/2021/07/age-verification | | |
| ▲ | Aurornis 3 hours ago | parent [-] | | Discord also announced that they would use algorithmic decision making to decide which accounts are old enough to not require ID. This didn't change the uproar at all. If you think KOSA-style regulations would allow social networks to avoid ID checks, I don't think you're paying attention. Just read the article we're all commenting on to see how people are willing to attack Facebook for even having internal statistical ideas about problems. If a KOSA-style law were passed and Facebook could be shown to have knowledge that some percentage of minors were evading their algorithm, they would be pulled in front of Congress again. There is no way to reasonably look at these laws and think that they would not result in ID check requirements. We don't even have these laws yet and platforms like Discord are already rolling out ID verification. | |
| ▲ | dfxm12 2 hours ago | parent [-] | | Actions taken unilaterally by private platforms are distinct from government regulation. That is an important distinction that your posts are not addressing. What Meta or another company would decide to do on their own is not "regulation" & is up to them. You haven't addressed the fact that KOSA is calling for regulation for kids online without forcing everyone to go through age verification with anything but your own assertions grounded in nothing in particular & other unrelated topics. | | |
| ▲ | Aurornis an hour ago | parent [-] | | > You haven't addressed the fact that KOSA is calling for regulation for kids online without forcing everyone to go through age verification And you're still ignoring the fact that any regulations targeted at kids online inherently require that all users' ages be known somehow. You can't have regulations that require companies to do something for kids' accounts without implicitly requiring that they identify which accounts belong to kids. You can't identify which accounts belong to kids without having all accounts verify their age. If this were presented as a "parental controls option" bill, I could believe the angle you're trying to go with. However, any regulations that say platforms must do something for kids' accounts will inherently lead to a requirement to verify all accounts. | |
| ▲ | dfxm12 an hour ago | parent [-] | | > And you're still ignoring the fact that any regulations targeted at kids online inherently require that all users' ages be known somehow. No, this has already been addressed. :( |
|
|
|
|
| |
| ▲ | hrimfaxi 5 hours ago | parent | prev [-] | | The full text is here for the interested: https://www.congress.gov/bill/119th-congress/senate-bill/174... |
|
|
| ▲ | mrsssnake 5 hours ago | parent | prev | next [-] |
| The only way to get rid of domestic abusers in your neighborhood is to detonate an atomic bomb at the town center. |
|
| ▲ | rkangel 5 hours ago | parent | prev [-] |
| Which step in this logic do you not accept?
1. When profit for a company is in conflict with human good, regulation is needed (e.g. health and safety rules)
2. Facebook causes harm, disproportionately so for younger people
3. Meta is aware of this, but due to a profit motive does not take serious steps to do anything about it (only token efforts)
4. Meta (and other social media) needs regulation |
| |
| ▲ | blululu 5 hours ago | parent | next [-] | | As the sister comment to this makes clear: regulation is needed in this area but that specific bill has a ton of problems. We should rewrite it and remove the more privacy infringing aspects. | |
| ▲ | jacobsimon 5 hours ago | parent | prev | next [-] | | > Facebook causes harm, disproportionately so for younger people I think I disagree with this step. Facebook causes a kind of indirect harm here, and is used willingly by teens and parents, who could simply choose not to use it. That's different from, say, a factory polluting a river with toxic chemicals, which needs government regulation. Basically "negative externalities". | | |
| ▲ | rkangel 5 hours ago | parent | next [-] | | > who could simply choose not to use it There is an inherently addictive aspect to it though - carefully evolved over the years by optimising for "engagement". One (imperfect) analogy is gambling - anyone can in theory choose not to gamble, but for some people addiction gets in the way and they don't make the choice that would be good for them. So (in the UK) the gambling industry is regulated in terms of how it advertises and what it needs to provide in terms of helping people stop. I don't know if this particular regulation is in any way effective, but I do think that some regulation is appropriate. | |
| ▲ | jacobsimon 4 hours ago | parent | next [-] | | Yeah that’s a good counterpoint. I guess it hinges on whether you can define a clear boundary around what is harmful or unharmful social media. Like to me “online shopping addiction” is probably a more realistic and analogous problem to gambling, so maybe online advertising to teens could be regulated, but the jump to child abuse is so far outside Meta’s actual business model that it feels over-reaching to go there. | |
| ▲ | xg15 4 hours ago | parent | prev [-] | | I like how everyone on this thread is up in arms about Zuckerberg - until the moment where regulation is mentioned. Then it's suddenly "oh well, they could just, like, not use it, couldn't they?" There is also peer pressure/FOMO. "Choosing not to use it" is not exactly easy if everyone else in your social group uses it - especially for teens. | | |
| ▲ | jacobsimon 4 hours ago | parent [-] | | I’m not saying it’s easy for teens to stop using social media, I’m just saying it doesn’t seem like it should require intervention by the US government to do so. There are many other ways to go about social change. | | |
| ▲ | xg15 4 hours ago | parent [-] | | Which would be? The harmful effects of social media are a topic of public discussion for at least a decade now, if not more. I think if there were an effective grassroots/civil society way to address this, it would have been found by now. | | |
| ▲ | jacobsimon 3 hours ago | parent [-] | | 1. Parental control features on phones and computers 2. Grassroots marketing about potential risks of social media 3. Maybe better parental consent via existing regulations like COPPA |
|
|
|
| |
| ▲ | xg15 4 hours ago | parent | prev [-] | | From the article, which quotes an internal study of Facebook itself on this: > An internal 2019 study titled “Teen Mental Health: Creatures of Habit” found the following: - “Teens can’t switch off Instagram even if they want to.” - “Teens talk of Instagram in terms of an ‘addicts narrative’ spending too much time indulging in compulsive behavior that they know is negative but feel powerless to resist.” - “The pressure ‘to be present and perfect’ is a defining characteristic of the anxiety teens face around Instagram. This restricts both their ability to be emotionally honest and also to create space for themselves to switch off.” | | |
| |
| ▲ | Aurornis 3 hours ago | parent | prev | next [-] | | > Facebook causes harm, disproportionately so for younger people > Meta (and other social media) needs regulation The first obvious flaw in your logic is that you jumped from "Facebook causes harm" to "other social media needs regulation". It should be obvious why that's broken logic. The second problem is that this is just the classic "think of the children" fallacy: you point out a problem, say it affects children, and then use that to shut down any debate about regulation. It creates a wide open door for intrusive regulation. This isn't new. It's been going on for decades. Yet people still walk right into this trap over and over again. So to answer your question: > Which step in this logic do you not accept? The step I don't accept is the one at the real core of the problem, the specifics of the regulation, but you conveniently stopped your logic chain before getting there. | |
| ▲ | mrsssnake 5 hours ago | parent | prev [-] | | Some regulation, yes; throwing the information-agnostic universal global packet-switching network in the trash bin is not the way. |
|