Jury finds Meta liable in case over child sexual exploitation on its platforms(cnn.com)
255 points by billfor a day ago | 398 comments
nclin_ 4 hours ago | parent | next [-]

$375 million awarded at $5,000 per child harmed, implying that only 75,000 children were harmed.

Got away with it again, good profit, will repeat.
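The parent's arithmetic checks out; a minimal sketch, using only the figures from the comment itself (not from the ruling):

```python
# Implied victim count from the comment's own figures:
# a $375M award divided by $5,000 per child harmed.
total_award = 375_000_000
per_child = 5_000

implied_children = total_award // per_child
print(implied_children)  # 75000
```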

riazrizvi 3 hours ago | parent | next [-]

That's not how the legal framework in society works. Victims are compensated. The business pays. The precedent of wrongdoing is specifically established which means that further infringements can be quickly resolved.

The legal system does not seek to destroy the business, or individual criminal. Instead it wants them to be able to continue doing their other non-criminal stuff.

munk-a 3 hours ago | parent | next [-]

The legal system has two goals - to compensate individuals harmed and to discourage further violations of the law. This lawsuit seems to have fulfilled the first goal but fell flat on its face when it comes to punitive damages.

riazrizvi 3 hours ago | parent [-]

I think there's an axis of perceived wrongdoing here, and you and I fall on different points. Yours is more extreme: you say Meta was doing broad harm by exploring this activity, and want to see greater damages to scare other businesses off from the general territory of addictive interfaces. Mine is where we want businesses to continue to explore and develop 'sticky', compelling user experiences, but Meta went too deep in some specific ways.

EDIT: I see I'm mixing up the New Mexico case yesterday on sexploitation with the addiction case in Los Angeles I thought we were talking about here.

munk-a 2 hours ago | parent | next [-]

To start off with my personal beliefs... I agree - I see a much broader harm in how platforms try to make themselves addictive, as I've worked on such systems in the past. I think the public, and even most technical folks who aren't deep into engagement metrics, underestimate how studied the field has been and how many iterations of approaches to daily engagement reminders, friction removal and FOMO have been worked through to get to the point we're at today. In my opinion, which absolutely isn't fact, this work is broadly unproductive at improving our daily lives - I can understand that there are some compelling counterarguments that these developments can be harnessed for good, but I don't share them.

But, specific to this article and ignoring my personal beliefs - I still find this judgement to be severely lacking. I don't think this judgement is nearly noticeable enough to Meta to actually provide a significant impact on the way they do business outside of tidying up some specifically egregious corners and making sure they internally communicate moving forward in a way that appears to comply with the judgement. The judgement was enough when applied to this pool of users to make these specific users unprofitable in retrospect (e.g. Meta would have more money if it had refused to even do business with these users) but I'm also concerned that the pool of considered victims was so narrow that it excluded a significant number of similarly harmed victims and that the amortized damages end up being negligible.

riazrizvi 2 hours ago | parent [-]

I guess we have deep deep divisions on what everyone is doing in society, and what makes a 'good' society.

As I've aged, I've entered new-to-me territory where a good society needs to reflect the world as it is, so that its members have high survivability.

At the local family level, for instance: when my kids were young, I had dreams of being super financially successful so that I could give them lots of nice things. I just don't want that for them anymore. Protection, and pandering, does not make a good lineage IMO. It's something of a leap I'm asking of you to connect this to my position here on Meta, but I've got other work to do, and I hope it's enough to convey my point.

gusgus01 3 hours ago | parent | prev | next [-]

This was about Meta's platforms not doing enough to protect children from sexual material (and allegedly ignoring employee warnings and lying to the public about it), not intrinsically their addictive interface and compelling user experience. I suppose the actions necessary to protect children from exposure to sexual material/exploitation could limit their ability to make certain changes to their platform, e.g. tighter moderation would reduce the amount of content that could be uploaded, but they could also have just not allowed children on the platform (like how Facebook started) and then not worried about child exploitation?

ethanwillis 2 hours ago | parent | prev [-]

In what specific ways did it go too deep? It's hard to understand when you're being so vague.

lp4v4n 3 hours ago | parent | prev | next [-]

It's very hard to think they wouldn't do something harmful to children again if the economic incentives aligned. For corporations it's just so easy to say sorry, and in the worst case they know an irrelevant fine will be imposed in order not "to destroy the business".

blks an hour ago | parent | prev | next [-]

8 Xboxes is a pretty small compensation for a sexual abuse case.

nclin_ 3 hours ago | parent | prev | next [-]

The function of a system is what it does.

Meta knowingly hurt children for profit. It worked.

If we are in any way serious about technocratic solutions to social problems, this would be untenable: the company would be bankrupted and a new company would fill its place. No tears would be cried, nothing of value would be lost, and half of Hacker News would be champing at the bit to build a better alternative for the newly opened market.

But that's not what happened. We allowed children to be knowingly hurt for profit.

The system is functioning as intended.

gustavus an hour ago | parent | next [-]

https://www.astralcodexten.com/p/come-on-obviously-the-purpo...

riazrizvi 3 hours ago | parent | prev [-]

Not hurting children is a pretty popular idea. So why don't you make that technical product for children based on this foundation, and blow Meta out of the water? I love your conviction. Good luck.

nclin_ 2 hours ago | parent | next [-]

Not taking this as good faith, if you're devolving into sarcasm I assume you have no insight to offer.

riazrizvi 2 hours ago | parent [-]

It's not sarcasm. I'm channeling you to a more productive focus for what I see as reaching beliefs/hopes. Try and make them happen instead of trying to convince other people they should happen. It will either temper/align them to the world as it is, or show the world what it can be.

nclin_ an hour ago | parent [-]

Ok, well let's think about this with the same framework I'm trying to bring to the discussion above: system dynamics.

Your comment has the effect of being flippant, condescending, and seemingly callous to the subject matter. When called out, you have backed up to an alternative explanation which is, again, massively condescending (I don't need channeling mate, certainly not from you).

You have not engaged with the content in a good faith manner.

So, standing back and looking at your comment in terms of its effects rather than what it claims to be its effects (AND the effect that making those secondary claims have - doubling down on condescension), it looks more like you're trying to bully me into changing my behaviour and viewpoint without meaningfully engaging with the content.

Ironically, I'm feeling psychological reactance, so your comments polarized me against you (see the Backfire Effect) and deepened my convictions.

I won't engage with bullies any further except to call them out. I'm hesitant to bring the conversation down to this level and give you any kind of air to begin with, but I think it's important to analyze discourse as it happens.

canelonesdeverd 2 hours ago | parent | prev [-]

Whatever point you're trying to make I hope you realize it's not a good look to phrase it like that.

roysting 2 hours ago | parent | prev | next [-]

That’s just sophist gaslighting. If an individual perpetrated some act of sexual exploitation of minors, or even only facilitated it, they would not simply pay a $5,000 fine per child whose life has been ruined and then continue on “doing their other non-criminal stuff”.

Stop trying to gaslight people and think about what you are defending and making excuses for, instead of basically being a conspirator facilitating these vile acts through excusing effectively no consequences. If your daughter was sexually exploited, do you think $5000 would be adequate compensation? Possibly even without covering therapy?

I am not sure about the particulars of this case, and I think parents are also largely responsible, just like in any other criminal negligence case, but that is no excuse to simply let corporations, who after all we are told are people, be some kind of superior, special people who are not punished to any even moderately consequential degree like actual, real people. Are they people or not? So they get to commit crimes but also not face actual, real consequences? Just stop and think about what a bunch of nonsense you are promoting.

We actually need a punitive system similar to the individual punishments. That would maybe look like a seizure of a percentage of the company similar to the percentage of one’s life one would spend in prison for a similar act. Yes, it would be a lot if it were, e.g., a 1/3 of the ownership of Facebook (which is easily done by forced issuance of shares), but that would also be the incentive to make sure that you, Facebook, are not facilitating child sexual exploitation.

The current problem with all of our systems is that there are only perverse consequences where the perpetrators of evil benefit and profit from the evil, while everyone else pays the cost. That needs to be flipped.

notnullorvoid 3 hours ago | parent | prev [-]

They have enough lawyers that they can easily find another criminal avenue that doesn't step on the previous path.

riazrizvi 3 hours ago | parent [-]

Your opinion isn't particularly important in our legal system, since your comment expresses a preconceived notion of the accused's guilt. It would disqualify you from a jury, and undermine your legitimacy in a judicial, defensive and even prosecutorial function.

Though I respect it as a human opinion.

lithocarpus 4 hours ago | parent | prev | next [-]

This represents 0.6% of Meta's 2025 profits, or 0.2% of revenue. Though presumably it was based on harms from previous years; I haven't read the lawsuit.
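As a back-of-the-envelope check (the ~$62B profit and ~$190B revenue figures are assumed ballpark numbers for Meta's 2025 financials, not taken from the comment or the article):

```python
# Award as a fraction of assumed annual profit and revenue.
award = 375e6     # $375M judgement
profit = 62e9     # assumed ~$62B annual profit
revenue = 190e9   # assumed ~$190B annual revenue

print(f"{award / profit:.1%} of profit")    # 0.6% of profit
print(f"{award / revenue:.1%} of revenue")  # 0.2% of revenue
```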

petcat 4 hours ago | parent | prev [-]

Well hopefully now that there's precedent, it will open them up to recurring repeat-offender lawsuits and legal action. The goal is to get them to stop doing predatory things now.

dwedge 11 hours ago | parent | prev | next [-]

Maybe I'm just getting old and cynical but, while I think current social media is bad for children, I'm very suspicious of the current international agreement that it's time to take action, especially with all the ID verification coming from multiple avenues

MildlySerious 11 hours ago | parent | next [-]

Two things can be true, and I am in the same boat. Should the next generation have their brains fried by ad-tech corporations and their algorithms? Absolutely not. Should the overdue off-ramp from this trend be the on-ramp to mass-surveillance and government overreach? Also a firm no.

benrutter 11 hours ago | parent | next [-]

I really wish this take was more prominent. I really don't buy that mass-surveillance should be required for age verification. There are plenty of very smart people who have created much more complicated things than a digital age verification that doesn't track every time you use it.

This also isn't helpful, but I think the sudden push of urgency isn't helping. The internet has existed without any kind of age verification or safety measures for about 30 years. We could have used that time to have a sensible conversation about policy trade offs, but instead we've waited till now to decide that everything has to be rushed through with minimal consideration.

OkayPhysicist 6 hours ago | parent | next [-]

You don't even need to go all high-tech with it: Children, by nature of being children, aren't going out and buying their own smartphones and computers. When Mom and Dad buy the device for their kid, just punch in the kid's age before handing it to them.

That's the flow that California's age verification system uses. Personally, I'm opposed to any age verification beyond the current "pinky promise you're 18" type deals, but California's is the least intrinsically offensive to me.

autoexec 4 hours ago | parent [-]

> When Mom and Dad buy the device for their kid, just punch in the kid's age before handing it to them.

Doing this doesn't accomplish anything in terms of protecting children from the harms of the internet. In fact it feeds your child's age to marketers and child predators.

Every website will get to decide how to handle the age data our devices will now be supplying them. In the case of facebook, it's not as if they had no idea the children endlessly posting selfies and posting "six seven" on their service weren't adults. Facebook was 100% aware that the children using their service were children. They knew what schools those kids went to, who their parents were, which other kids they hung out with. Facebook knew they were children and they took advantage of that fact.

The law California (and other states) passed doesn't define what content has to be blocked for which ages and doesn't give parents any ability to decide what content their children should or shouldn't be allowed to see. It takes control away from parents. As a parent, I might think that my 16 year old should be allowed to look up information on STDs but the websites that collect my child's age could decide they can't and I'll have no say in it.

jacobgkau 3 hours ago | parent [-]

> The law California (and other states) passed doesn't define what content has to be blocked for which ages

No, but it's a framework that would allow other laws to do so. Because...

> it's not as if they had no idea the children endlessly posting selfies and posting "six seven" on their service weren't adults.

...you can make statements like that which sound like common sense, but it would be incredibly hard to regulate based on "if you know, you know" (or "you should have known"/"you had to have known"). The law has to provide (guarantee) a way for them to know in order to actually require them to take action based on it.

> As a parent, I might think that my 16 year old should be allowed to look up information on STDs but the websites that collect my child's age could decide they can't

This is a different problem. It sounds like you're essentially wanting to guarantee access to certain things, not just for your own 16-year-old, but for everyone else's, too (because if it was just yours, you could look it up for/with them if necessary). It'd be difficult to compel businesses to provide services to audiences they don't want to. But again, that's a separate problem that doesn't necessarily conflict with the rest of the system.

jt2190 10 hours ago | parent | prev | next [-]

> We could have used that time to have a sensible conversation about policy trade offs [of age verification]…

There is always a conversation, but it is often not the popular one and gets drowned out by whatever everyone is excited about at the moment. You can find it if you seek it out.

Lawrence Lessig’s book “Code” (1999), for example, talks about how a completely unregulated internet is an anomaly and that regulation will certainly be necessary, and advocates that it be done in a thoughtful manner.

pixl97 9 hours ago | parent | prev | next [-]

>used that time to have a sensible conversation about policy trade offs,

On HN itself, no way. Too many people here make far too much money on ads to want that. It seems the other part that wants freedom wants so much freedom that it gives huge corporations the freedom to crush them.

>things than a digital age verification that doesn't track every time you use it.

The big companies that pay the politicians don't want that, therefore we won't get that.

intended 2 hours ago | parent [-]

> On HN itself, no way. Too many people here make far too much money on ads to want that.

Ya know, this might explain why the warnings seem to fall on deaf ears here.

New favorite person on the internet.

ball_of_lint 6 hours ago | parent | prev | next [-]

It's not about "doesn't" - the government can always claim that it doesn't track you. That claim is unlikely to stay true.

It's really either they can't track you or they will track you.

jimbokun 9 hours ago | parent | prev [-]

Best time to plant a tree: 30 years ago.

Second best time to plant a tree: now.

gopher_space an hour ago | parent | prev | next [-]

A Kindernet would solve many problems. Hardware-gated access, local moderation and control, zero commerce or copyright, whatever you want to do to make the environment uninteresting to bad actors. Frame opposition to the concept as demand for access to your children.

svachalek 7 hours ago | parent | prev | next [-]

Exactly. There's a clear alternative in my mind, one I'm sure is objectionable in its own way but I think is the least evil of the three: require providers to label their content and make them liable for it. This allows parents to do the censoring, which is functionally impossible now because no parent can fight the slippery power of multibillion dollar software investments designed to prevent them from having control over what their kids see.

ed_blackburn 10 hours ago | parent | prev | next [-]

Absolutely: I said something similar recently: https://news.ycombinator.com/item?id=46766649

jimbokun 9 hours ago | parent | prev | next [-]

So you're saying these corporations are responsible for verifying the age of their users without verifying the age of their users?

MildlySerious 2 hours ago | parent [-]

I'm not saying that, nor did I allude to it in any way. I made no assertions as to what the solution should be.

The ideal scenario would be everyone choosing not to engage with these predatory platforms. Going from there, the right question to me is what steps we have to take as a society for that to become even remotely realistic and, subsequently, what role governments can or should have in that.

For starters, I would be in favor of fines that actually hurt the bottom line instead of this "cost of business" bullshit. We have handed these corporations unprecedented access to and control over our lives, to the point that they erode democracy and the social fabric itself. The inevitable abuse of that power when it comes with barely any strings attached needs to be punished in a way that makes it unattractive as a business model at the very least.

Instead of lowering the attack surface by locking out kids, and in turn introducing mass surveillance which at best also lends itself to abuse, the root issues of ruinous greed and lack of accountability need to be addressed. The whole concept that there is no price too high for profits needs to burn. Social media is just one of the more recent manifestations of it.

Forgeties79 9 hours ago | parent | prev [-]

They’re the oil barons of our day. They frack our data and output psychological/social pollution.

b00ty4breakfast 10 hours ago | parent | prev | next [-]

That's because we should be regulating the social media industry rather than regulating social media users.

Unfortunately, social media users don't have billions of dollars to spend on lobbying and related activities around the world.

Aurornis 9 hours ago | parent | next [-]

> That's because we should be regulating the social media industry rather than regulating social media users.

These lawsuits and regulations are against the industry, not the users.

The regulations and lawsuits are driving the pressure to ID check users and remove end-to-end encryption.

jimbokun 9 hours ago | parent | prev [-]

The ask is to treat users differently based on age. How can they do that without verifying their users age?

autoexec 4 hours ago | parent | next [-]

You honestly think facebook has no idea that the children using their website are children? The combination of the children's selfies, social network, GPS coordinates, and posts make it very clear. Facebook already knows who the children are and they've been explicitly targeting them accordingly.

bryan_w 3 hours ago | parent [-]

You want people to be kicked off the internet because they have a baby face? You think the law should mandate the use of an imperfect facial recognition system?

autoexec 3 hours ago | parent [-]

I think that facebook has been using facial recognition on every photo uploaded to their platform for a very very long time and that they already use that data in part to determine the age of users. Facebook hasn't been kicking anyone off the internet because of that data so far. Instead facebook just targeted the users they decided were children as children.

Forcing the users to verify their age changes nothing. It gives the illusion of "doing something" but it just gives facebook data they already had. What's still needed is regulating social media platforms themselves to place explicit limits on what they can do to hurt their users, including children.

b00ty4breakfast 9 hours ago | parent | prev [-]

we should be removing the harmful aspects of modern social, which are harmful for everyone not just minors, by making them unprofitable or even outright illegal.

Instead we are saying "only adults should use this" which, while technically regulating the industry, places the restriction on users.

We're treating it like tobacco or alcohol (two industries that have similarly spent millions upon millions of dollars on lobbying efforts) but we should be treating it like asbestos.

jimbokun 8 hours ago | parent [-]

OK, so what would be in the text of this law making it enforceable and not easily game-able by the social media companies and without severe unintended consequences?

dminik 8 hours ago | parent [-]

Why are you asking lawmaker questions of people on HN? What kind of answer are you expecting?

Just because I don't know how to write a law that can prevent it doesn't mean that I can't recognize an actual issue when I see it.

jimbokun 3 hours ago | parent [-]

Because people like you then go and vote for politicians without actually understanding what they are proposing.

It's all Trump style "believe me I know how to fix it" and you will vote for the person that pushes your buttons regardless of whether they have a plausible solution or not.

crystal_revenge 2 hours ago | parent | prev | next [-]

> I'm very suspicious of the current international agreement that it's time to take action

Especially since, when you look at the behavior of younger people, they're way more careful about social media than millennials were. My teenage child and their friends keep all of their conversations in a massive but private group chat. Any social media consumed by them is basically 'read only'. They don't post online, and none of them have social media accounts where they post pictures of themselves etc.

Same with all of my younger gen-z coworkers. If they have socials, they post very selectively and all content is work-friendly.

The people I see that need "protection" are aging millennials who don't really understand how wildly they're exposing themselves and their families. I cringe when I see the amount of personal photos and information shared by the few millennials I know who still need their ego boost from these platforms (and that number itself is much smaller).

Younger people don't share their opinion and anything resembling private photos online any more.

afavour 6 hours ago | parent | prev | next [-]

Meta spent $2bn lobbying for this ID verification stuff:

https://news.ycombinator.com/item?id=47361235

b65e8bee43c2ed0 11 hours ago | parent | prev | next [-]

given that it's happening simultaneously with the war on E2EE and general purpose computing, their goals are as transparent as it gets. the West is at this point only a decade behind China.

barbazoo 5 hours ago | parent | prev | next [-]

There's no agreement other than maybe that social media is bad for children. To get kids off of there you need to identify who's a kid and who isn't. Same with alcohol and tobacco. Obviously people shouldn't give their ID to Meta, and hopefully many will not, but for those that do - to me, as someone who doesn't use social media - that's a small price to pay to keep kids off. Again, Meta is completely optional; it's a platform to share stupid videos, no one NEEDS to be there.

raincole 11 hours ago | parent | prev | next [-]

Governments always want censorship and speech control. That never changes. The only difference is that now the general populace has accumulated enough disgruntlement with social media for it to be used against them.

gmerc 11 hours ago | parent [-]

No, the difference is that when governments are still constrained by the rule of law, it's cheap PR to fight the government on data access claims. But once they are authoritarian, fascist industrialists fall over themselves to feed everything into Palantir.

Aurornis 9 hours ago | parent | prev | next [-]

I’m deeply worried by how uncritical these responses are. Meta is removing end-to-end encryption specifically because these lawsuits are trying to claim end-to-end encryption is a tool for child abuse.

The “think of the children” angle is the perfect angle to pressure companies to make communications readable by the government. And here tech audiences are welcoming it and applauding because they couldn’t read past the headline and they think anything that hurts Zuck is good.

How anyone can see this happening and not draw the connections to Discord and other services also pushing ID checks is beyond me. Believing that this will only apply to services that don't affect you is short-sighted.

lionkor 11 hours ago | parent | prev | next [-]

A lot of the ID verification stuff is coming FROM those companies

boysenberry 10 hours ago | parent [-]

I’ve just been stung by iOS 26.4’s implementation of the age-gate. My only option has been to rollback with a 26.3.1 IPSW.

I unlurked and made a thread last night, but I think it might be hidden due to account age: https://news.ycombinator.com/item?id=47511919

Ajedi32 9 hours ago | parent [-]

Yep, your post and this comment were hidden. I vouched for them so they're visible now. Good luck!

gostsamo 11 hours ago | parent | prev | next [-]

because it is a false dilemma

intended 11 hours ago | parent | prev | next [-]

Meta is lobbying to push age verification to the OS level.

I have read the OSINT report from Reddit. The data it has is being interpreted as Meta orchestrating a global lobbying scheme.

However the data is equally if not more supportive of Meta simply taking advantage of global political sentiment to position itself better.

I’ve mentioned this elsewhere, but the HN zeitgeist seems to be resistant to the idea that tech is the “bad guy” today.

I work in trust and safety, and have near front row seats to all the insanity playing out today.

bryan_w 3 hours ago | parent [-]

Do you think Meta wouldn't want to be legally mandated to ask for your ID? The improvement to ad targeting alone would be enough to pay for any lost users. They would probably want nothing more than to be in the same business as IDEMIA and the other online identity/age verification providers.

Critically think about this for a second before believing some ChatGPT-generated "OSINT" report on Reddit. Otherwise, you'll allow corpos to use your mob hatred against you.

intended 3 hours ago | parent [-]

I think that report has multiple issues, but it’s currently popular and people are fond of blaming meta.

Even on your point: Meta is not after mandated IDs, but they see the way public opinion is moving and are using it to their tactical advantage. They are lobbying to push the regulatory burden onto app stores and operating systems.

expedition32 11 hours ago | parent | prev | next [-]

Tech bros deliberately made digital crack for kids and corporations refuse to moderate online content.

There is no conspiracy the general public is faced with a crisis and they are desperate for a solution.

The teen suicide statistics do not lie.

Manuel_D 8 hours ago | parent | next [-]

> The teen suicide statistics do not lie.

Teen suicide rates in the US are lower now than they were in the 1990s.

claaams 8 hours ago | parent | next [-]

This doesn’t paint the entire picture. Suicide rates peaked in 1990 and then declined to their lowest point in 2007; from there, the rates started rising again.

Manuel_D 8 hours ago | parent [-]

Like all metrics, they fluctuate over time. But they've remained pretty stable for decades at around 10 per 100k per year. The recent rise doesn't really coincide with social media adoption. By 2008, >80% of teens were using social media. If social media adoption was driving the increase in suicides, we would have started to see a rise in suicides around the early 2000s, reaching its peak around 2008. But that adoption of social media by teens was coupled with a decrease in suicides. The more recent rise in teen suicides occurred during a period of largely flat teen social media adoption (because nearly 100% of them were already on social media by the end of the 2000s).

This idea of teen suicide painting a clear picture about the impact of social media just isn't borne out by the data. And lastly, people ought to remember that teens have the lowest rate of suicide among any age cohort.

johnmaguire 4 hours ago | parent [-]

> If social media adoption was driving the increase in suicides, we would have started to see a rise in suicides around the early 2000s, reaching it's peak around 2008.

I think there is a logical fallacy here. Social media has not remained stable since 2008. For one thing, 2008 social media used the chronological timeline. For another, it didn't show "recommended" (or sponsored) content in your feed. There was no TikTok. Facebook was relatively new and MySpace was not even really feed-based as I recall.

Manuel_D 3 hours ago | parent [-]

Facebook moved away from chronological timelines as default in 2011. YouTube added "recommended" videos tab in 2007.

expedition32 4 hours ago | parent | prev [-]

The world is bigger than the US.

Anyway you can go on HN and deny there is a problem but you will lose public opinion and crucially the voting booth.

Manuel_D 3 hours ago | parent [-]

The fine was levied in a US court.

dwedge 10 hours ago | parent | prev [-]

The general public is being told they are faced with a crisis. This has been a problem for at least a decade, yet suddenly it's at the forefront and conveniently ties into ID verification for everyone to use general purpose computing.

I'm sorry but if you don't think there's a conspiracy I have a bridge to sell you. It was already unveiled that Meta has lobbied billions towards promoting this legislative change

jimbokun 9 hours ago | parent | next [-]

You're arguing there's a conspiracy, but even if there is, what is the best action for governments to take given the devastating impact social media has been demonstrated to have on young people especially?

dwedge 8 hours ago | parent [-]

I don’t know what the solution is, but introducing mass surveillance of ALL users on their own devices hurts the general population - do you think it will solve the problem?

kgwxd 9 hours ago | parent | prev | next [-]

> The general public is being told they are faced with a crisis.

> This has been a problem for at least a decade.

I get your point, but anyone that doesn't is asking "Which is it?"

I think everyone can see there are problems. Is there a crisis? I don't think so. Same problems we've always had, but on a computer.

People that know tech, know these laws cross a MAJOR line. Not a little slippery slope thing, this is off a cliff. But I don't think most people, that are already used to having to sign in with an online account on every device they use, even their TV, see it as that big a step. They don't even realize how predatory it is that they are required to sign in. What they need to see is that the sign in requirement was a choice by the vendor. These are LAWS, demanding no one ever be given the choice to not reveal personal information about themselves to use ANY computer. That's the point that needs to be driven home.

intended 8 hours ago | parent | prev [-]

Oh hell no!

It's been decades of work to even get social media to court.

No one wants to talk about this or look at the issues when it’s not sexy.

$@&$$ - I’ve been at conferences and had safety teams cry on my shoulder about how THEY don’t get engineering resources even when they ask for them.

Tech platforms suppress so much research and hold so much data hostage that an entire research coalition had to be built around independence from tech.

Zuck and tech as a whole pivoted to drop safety investments the moment this government came to power.

And this is for users in frikking America!

The shit that is going down in the rest of the world is a curse. The sheer amount of NCII that exists, with zero recourse for people whose lives are destroyed is insane.

dminik 8 hours ago | parent [-]

> Zuck and tech as a whole pivoted to drop safety investments the moment this government came to power.

I think the question to ask here is, if both Meta and the current administration don't care about child safety, why is the age verification stuff going so smoothly? Is helping them do this really the right move?

intended 3 hours ago | parent [-]

Well it’s not going smoothly. People on HN are talking about it now, but they are really talking about privacy.

For the rest of the world this has been brewing for more than a decade.

Australia was actually the one to tip the first domino. This is just a US state verdict on willful harm by a firm. It's not even about age verification.

For Meta, shifting regulatory burdens to the OS / app stores reduces its own regulatory burden.

For governments, part of it is actually trying to come to grips with an impossible safety imperative and another part of it is happy to gain more control and power.

The power grab needs to be curtailed, and the people actually trying to help kids need better technical solutions.

kgwxd 9 hours ago | parent | prev [-]

Really? You still think you're the one looking at it all wrong? It's exactly what you think it is. Stop giving blatant malice the benefit of the doubt, especially the doubt they've directly instilled.

Aurornis a day ago | parent | prev | next [-]

Many will cheer for any case that hurts Meta without reading the details, but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption:

> The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

The New York case has explicitly gone after their support of end-to-end encryption as a target: https://www.reuters.com/legal/government/meta-executive-warn...

mjevans a day ago | parent | next [-]

The correct nuance here is...

* Classifying accounts as child accounts (moderated by a parent)

* Allowing account moderators to review content in the account that is moderated (including assigning other moderation tools of choice)

In all cases, transparency and enabling consumer choice should be the core focus.

Additionally: by default, treat everyone online as an adult. Parents who allow their kids online without supervision, or without some setting indicating that the user agent is operated by a child, are in effect allowing their children to interact with strangers. This tends to work out better in more controlled and limited circumstances where the adults involved have the resources to provide suitable supervision.

At the same time, any requirements should apply only to commercial products. Community (gratis / not for profit) efforts presumably reflect the needs of a given community.

kelseyfrog a day ago | parent [-]

> Classifying accounts as child accounts

It's ok to drive Dad's truck unless he catches you and tells you no.

IAmBroom 10 hours ago | parent [-]

Unfair presentation. What they suggested was more akin to, "Assume someone with keys is an adult, and let them start the truck."

Dad should either know his children would never drive the truck without permission, or keep his keys as safe as his wallet (and if he can't trust his kids with keys, you bet his wallet needs protection).

pylua a day ago | parent | prev | next [-]

I'm actually okay with not letting underage people use e2e. I'm not okay with blocking everyone. I have 2 kids.

fourside a day ago | parent | next [-]

I understand the concern but then to make this available for adults you now have to provide proof of age to companies, which opens up another can of privacy worms.

skybrian a day ago | parent [-]

Theoretically we don't actually need proof of age. Websites need to know when the user is attempting to create an account or log in from a child-locked device. Parents need to make sure their kids only have child-locked devices. Vendors need to make sure they don't sell unlocked devices to kids.
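That flow could be as simple as a signal the device attaches to its requests, which sites then honor. A minimal sketch, where the `Sec-Device-Minor` header name and the whole mechanism are hypothetical (no such standard exists today):

```python
# Hypothetical sketch of the scheme above: a child-locked device attaches a
# flag to its requests, and the site refuses account creation when it sees it.
# The "Sec-Device-Minor" header name is invented purely for illustration.
def may_create_account(headers: dict) -> bool:
    """Return False if the request appears to come from a child-locked device."""
    return headers.get("Sec-Device-Minor", "0") != "1"

# An unlocked (adult) device omits the flag; a child-locked device sets it.
assert may_create_account({"User-Agent": "ExampleBrowser/1.0"}) is True
assert may_create_account({"Sec-Device-Minor": "1"}) is False
```

Everything here hinges on the device honestly reporting the flag, which is why the scheme above puts the burden on parents and vendors rather than on the website.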

polyomino a day ago | parent [-]

Children do not want child-locked devices, and they will find alternatives.

sixsevenrot 11 hours ago | parent | next [-]

As with smoking, alcohol, sex, drugs etc

Children who are smart enough to get access to a given vice without getting caught are more likely to be mature enough to be able to cope with that vice.

cr125rider 5 hours ago | parent [-]

I think we’re going to see how that plays out with gambling.

It seems a bit silly to think security abstinence is the solution.

skybrian 21 hours ago | parent | prev | next [-]

True, it's never going to be 100%, but at least it's a tractable problem for parents. Enough to change what the culture considers "normal," anyway.

IAmBroom 9 hours ago | parent | prev | next [-]

Imperfect solutions are still called "solutions".

kakacik 5 hours ago | parent | prev [-]

Well then don't give them the money to do so; it's not like phones grow on trees. If you make selling a phone or internet device to a minor under a certain age an illegal act severely punished by law, in the same way alcohol and cigarettes are handled, many cases of access are solved. Also, a paid internet subscription doesn't grow on trees either, even though there are free wifi networks.

All imperfect solutions, but they slice the original huge problem into much smaller chunks which are easier to tackle with the next approach.

whatshisface a day ago | parent | prev | next [-]

I'm not comfortable with the idea that children's private messages would be exposed to thousands of social media workers and government employees.

newscracker 19 hours ago | parent | prev | next [-]

In a way, this is like saying that one trusts total strangers in some random large tech company and total strangers in government agencies to read and/or manipulate conversations that kids have. This also paves the way to disallow E2EE for other classes of people based on arbitrary criteria. I don’t believe this is good for society overall.

intended 16 hours ago | parent [-]

The reason we are having this discussion, is because the private route worked up to a point.

Firms have a fiduciary duty to shareholders and profit.

On the other hand, you ultimately decide the rules and goals that govern government organizations, and they do not have a profit-maximization target.

They aren’t the same tool, and they work for different situations.

The E2EE slippery slope is a different challenge, and for that I have no thoughts.

triceratops a day ago | parent | prev | next [-]

I have kids. I don't want creeps and predators spying on their conversations with friends.

pylua a day ago | parent | next [-]

That's true, I didn't consider that

jMyles a day ago | parent | prev [-]

https://web.archive.org/web/20210522003136/https://blog.nucy...

noosphr a day ago | parent | prev | next [-]

You just need to provide the government with your name and address and the name and address of the counter party every time you send an encrypted message.

If you don't support this you're obviously a pedo nazi terrorist.

hsbauauvhabzb a day ago | parent | prev | next [-]

The problem is all these ‘for the children’ arguments contain collateral damage.

vaylian 15 hours ago | parent | next [-]

And the effectiveness for the stated goal is also often questionable.

pylua a day ago | parent | prev | next [-]

It does seem like it could potentially be used to enforce mass surveillance over the people of the United States

simmerup a day ago | parent [-]

Alphabet can grep your emails, and Amazon has literal microphones and cameras in most people's houses.

That ship has sailed

pylua 21 hours ago | parent [-]

Yes, Google analyzes everything you upload to it, and if it finds a violation it will report you to the proper government agencies.

It is actually terrifying. If you write something out of context or upload an image out of context, you can be in big trouble.

intended 16 hours ago | parent | prev [-]

Well, the problem is that the “don’t do it” arguments have children as the collateral damage.

We are at a point where we are picking and choosing collateral damage targets.

usr1106 15 hours ago | parent | prev [-]

There is no reason kids should use so-called smart devices, except making certain companies richer. Kids have had a healthy development without such crap for thousands of years. We don't discuss what percentage of alcohol should be allowed in beer and wine for kids.

IAmBroom 9 hours ago | parent [-]

The French (watered wine) and British (shandies) do.

lrvick 4 hours ago | parent | prev | next [-]

Centralized organizations with proprietary software can never offer meaningful end to end encryption because they can just ship an app update to disable or backdoor it at any time.

It is better for them to be forced to turn off the security theater so people that need actual privacy can research alternatives.

ronsor 20 hours ago | parent | prev | next [-]

This is the core issue.

We know that this isn't really going to reduce harm for children, we know Meta is not seriously going to suffer or change, and we know this is going to be used as a cudgel to beat down privacy and increase surveillance.

armada651 14 hours ago | parent [-]

Why is it so important that kids have access to the internet that we're willing to sacrifice both our privacy and our freedom of speech for it, when we already know it's damaging their mental health?

We don't need all this privacy invasion if we just didn't give kids a smartphone with a data plan.

bitwize a day ago | parent | prev | next [-]

The Clipper chip is coming back.

intended 16 hours ago | parent | prev | next [-]

Rock meet hard place?

Harm to kids is actually happening, and this is always going to be a hot button topic.

E2E is critical for our current ability to communicate online, but will be a lower priority when pitted against child safety.

Fighting the good fight is one thing, fighting for the sake of it, without a plan that addresses the tactical reality is another altogether.

Personally, I think E2E will be defended, but it’s becoming a lightning rod for attention. As if removing encryption will solve the emerging issues.

I suspect providing alternatives to champion, such as privacy-preserving ways to verify age, will force a real conversation on whether E2E actually needs to go.

bdangubic 19 hours ago | parent | prev | next [-]

This is a good thing for “social” media. If you use any social media app (especially those owned by Meta) you should assume that absolutely everything you do is for full public consumption. Maybe these changes will make everyone stop thinking that anything is private when using “social” media apps.

themafia a day ago | parent | prev | next [-]

> Many will cheer for any case that hurts Meta

Absolutely. Particularly where they've been found to be guilty.

> but we should be aware that these cases are one of the key reasons why companies are backtracking from features like end-to-end encryption

Why _social media_ companies are backtracking. I'm extremely nonplussed by this outcome.

> concerns that allowing teens

Yes, because that's what we all had in mind when considering the victims and perpetrators of these crimes.

gzread a day ago | parent | prev [-]

Is it illegal or is it just illegal on general purpose platforms whose focus isn't extreme security?

We all know Meta can still read E2EE chats (otherwise they wouldn't do it) and they're using E2EE as an excuse to avoid liability for the things their platform encourages. Contrast this with something like Signal where the entire point is to be secure.

cristoperb a day ago | parent | next [-]

> We all know Meta can still read E2EE chats

That can't be true, otherwise in what sense is it E2EE?

duskdozer 15 hours ago | parent | next [-]

Well, I've seen services describe having "E2EE" where one end is your computer and the other end is their server, so...

vaylian 15 hours ago | parent | prev | next [-]

The metadata is still unencrypted. That also reveals quite a bit.

gzread a day ago | parent | prev | next [-]

In the sense that calling it E2EE gives people a warm fuzzy feeling and makes people send more sensitive information over the platform.

Has anyone actually audited it?

babelfish a day ago | parent [-]

Probably their auditors? Lying about this would be tantamount to (very serious) securities fraud. Not sure what you're basing your allegations on besides "trust me bro".

gzread 6 hours ago | parent [-]

Why would lying about having E2EE be securities (as in stock market) fraud? Would that make any lie ever told by a corporation equate to stock market fraud?

babelfish 6 hours ago | parent [-]

Yes! As Matt Levine says, “everything is securities fraud”

interestpiqued a day ago | parent | prev [-]

I mean, you can read it in your app, and the messages aren't just stored on your phone. E2E just means encrypted in transport, from what I understand.

SAI_Peregrinus a day ago | parent [-]

E2EE means end-to-end, where the ends are the participants in the chat. They can read it on your phone, but not on their servers. They need their app to separately transmit the plaintext to their servers to read it.
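As a toy illustration of that distinction (deliberately NOT real cryptography — a real E2EE system like the Signal protocol uses authenticated key exchange and ratcheting), the relay only ever holds ciphertext it has no key for:

```python
import hashlib

# Toy sketch: the two "ends" share a key; the relay stores only ciphertext.
# A hash-counter XOR stream stands in for a real cipher, purely for illustration.
def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

key = b"held only by the two participants"
server_copy = encrypt(key, b"meet at noon")  # all the relay ever sees
assert decrypt(key, server_copy) == b"meet at noon"
```

For the server to read the message, the client app would have to ship it the plaintext (or the key) separately — which is exactly the trust-the-app problem raised elsewhere in the thread.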

throwaway173738 a day ago | parent [-]

Which is technically possible.

markdown a day ago | parent | prev [-]

The first two E's in E2EE stand for end. From one end to the other. So no, Meta can't. Or put another way... if they can read those messages, then it's not E2EE.

sharkjacobs a day ago | parent | prev | next [-]

> The New Mexico attorney general’s office created multiple fake Facebook and Instagram profiles posing as children as part of its investigation into Meta. Those test accounts encountered sexually suggestive content and requests to share pornographic content, the suit alleges.

> The fake child accounts were allegedly contacted and solicited for sex by the three New Mexico adult men who were arrested in May of 2024. Two of the three men were arrested at a motel, where they allegedly believed they would be meeting up with a 12-year-old girl, based on their conversations with the decoy accounts.

and

> “The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” Bejar said.

This is what it's about right? The article doesn't make it seem like encryption is meaningfully part of this case at all.

> Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

There's no indication that the decision, or the announcement, is directly related to the trial; they just happened at the same time. It's a link drawn by CNN without presenting any clear connection.

zeeshana07x 9 hours ago | parent | prev | next [-]

Fines like this only work if they're large enough to change behavior. $375M for a company Meta's size is more of an accounting entry than a deterrent.

JumpCrisscross 5 hours ago | parent | next [-]

What is Meta’s revenue in New Mexico?

Also, “the total civil penalty of $375m was reached after the jury decided there were thousands of violations of the act, each with a maximum penalty of $5,000. Meta is also involved in a separate trial in Los Angeles, in which a young woman claims that she became addicted to platforms like Instagram and YouTube, owned by Google, as a child because of how they are intentionally designed.

There are thousands of similar lawsuits winding their way through the US courts.”

tremon 4 hours ago | parent [-]

Wait, what? This case's central argument was about propagating and promoting child sexual abuse material, but the maximum penalty was set to only $5000 per violation? Why?

JumpCrisscross 4 hours ago | parent [-]

> The jury found that Meta was responsible for violating New Mexico's Unfair Practices Act because it misled the public about the safety of its platforms for young users.

So the penalty is for misleading around CSAM. Not CSAM per se. (My understanding is the latter are still being adjudicated.)

CabSauce 8 hours ago | parent | prev [-]

While true, this is just one pretty small state. There are others.

fny 7 hours ago | parent | prev | next [-]

This fine from New Mexico is about 0.6% of Meta's annual profit.

If all 50 states sue at the same rate, that'll be a 30% dent, and I'm sure states can sue for more than 0.6% too. That would be historic action against malfeasance and would send a strong FAFO signal to all corporations.
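A quick back-of-envelope on those numbers (the $375M fine and the ~$60B annual profit figure cited elsewhere in the thread are the only inputs, and both are rough):

```python
# Sanity-check the 0.6% / ~30% claim above. Figures are approximations.
fine = 375e6           # New Mexico penalty
annual_profit = 60e9   # Meta's approximate annual net income

share = fine / annual_profit   # about 0.6% of a year's profit
all_states = 50 * share        # about 31% if all 50 states matched it
print(f"{share:.2%} per state, {all_states:.2%} across 50 states")
```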

Let's lobby for it.

rimbo789 6 hours ago | parent [-]

Why stop at the 50 states? Loop in the rest of the world.

exabrial 10 hours ago | parent | prev | next [-]

That fine is missing a few zeros on the right side

pluc 8 hours ago | parent [-]

It's not a fine, it's a fee.

tombert 5 hours ago | parent | prev | next [-]

They had to pay about $375 million. That's a lot of money, but I suspect that Facebook has made considerably more than that on targeting children.

I'm hardly the first person to use this logic, but if they make more money breaking the law than they have to pay in fines, then it's not a fine, it's a business expense.

conductr 5 hours ago | parent | next [-]

Agree with your take. However, to put the amount in perspective, consider that this is just New Mexico, so the per capita fine is actually quite large; if it were applied similarly nationally or globally, it could be a significant impact to their business, forcing some change.
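Scaling that per-capita intuition, with population figures that are rough assumptions of mine (New Mexico ~2.1M people, US ~335M), not from the article:

```python
# Rough per-capita scaling of the New Mexico penalty.
# Population figures below are approximate assumptions, not from the article.
fine_nm = 375e6
pop_nm = 2.1e6    # ~New Mexico population
pop_us = 335e6    # ~United States population

per_capita = fine_nm / pop_nm     # roughly $180 per NM resident
nationwide = per_capita * pop_us  # roughly $60B at the same rate
print(f"~${per_capita:,.0f} per resident; ~${nationwide / 1e9:.0f}B nationwide")
```

At the same per-resident rate nationwide, the penalty would be on the order of the $60B annual profit figure cited elsewhere in the thread, which is the sense in which it's "quite large".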

ryandrake 4 hours ago | parent | prev [-]

Proportionally, it's as if an individual who makes $60K/yr got a speeding fine of $375. Kind of a drop in the bucket.

tombert 4 hours ago | parent [-]

Especially if they were making $4,000 from street racing.

CrzyLngPwd 4 hours ago | parent | prev | next [-]

The fine is just one of the costs of doing business for these megacorps.

colordrops 3 hours ago | parent [-]

It's priced in.

sarbanharble 10 hours ago | parent | prev | next [-]

It takes 7 clicks to turn off ads that promote eating disorders. That's enough proof.

dawnerd 7 hours ago | parent | next [-]

You can click "not interested" infinite times on dangerous or adult reels and they'll just show up more and more.

criddell 7 hours ago | parent | prev [-]

What's an example of an ad promoting an eating disorder? Are ads for eating disorders more difficult to turn off than other types of ads?

bradley13 16 hours ago | parent | prev | next [-]

We don't want age verification, and we do want E2E encryption. Yet, because Meta is an evil company, we cheer on this judgement.

Reality, folks: you can't have both.

vaylian 15 hours ago | parent | next [-]

Those two things are unrelated to each other. And yes, we can do without age verification and we can have E2E encryption. Age verification is causing more harm than good. It also doesn't meaningfully help with any of the problems mentioned in the article.

pocksuppet 11 hours ago | parent | prev | next [-]

I think we don't want mandatory age verification or banned encryption for everything. However, you can't hide behind "it's not the law" as a shield for everything. Thanks to ubiquitous spyware, Meta knows damn well the age of almost all of its users, and if someone who's 40 is sending first-contact messages to 10 unknown 13-year-olds every day, it seems important to know what those messages say. They know this stuff is happening and they care about not being liable, not about your security.

We can assume Meta has backdoored its E2EE somehow anyway.

duskdozer 15 hours ago | parent | prev | next [-]

Well, assuming you won't also think it's okay for Meta to just be held liable anyway.

There are people who are against age verification just on principle and others who are against it because they know any realistic implementation is going to be abused.

dgxyz 16 hours ago | parent | prev [-]

Can we just agree we just don’t want Meta?

cedws 4 hours ago | parent | prev | next [-]

Wasn't Zuckerberg caught red-handed in emails signing off on this? When is he going to face consequences?

munk-a 3 hours ago | parent | next [-]

Corporate liability isolation has become absurd. People who make decisions that harm others should be held to account for those decisions, even if they structured their decision-making apparatus in a legal way that makes it look like they're just following the orders of the shareholders.

Zuckerberg has a brain; he decided to take this action. It is absurd that he is not being hit with a personal penalty.

etchalon 4 hours ago | parent | prev [-]

Consequences are for poor people.

ourmandave 12 hours ago | parent | prev | next [-]

Do we have to wait for any appeals before the performative routine of mailing out $1 settlement checks?

rubyfan 12 hours ago | parent [-]

Or the settlement goes to the state and no one ever sees a dollar.

billfor 5 hours ago | parent | prev | next [-]

and also https://news.ycombinator.com/item?id=47514916 It might be good to roll all the comments together.

ehl0 4 hours ago | parent [-]

two separate cases.

inetknght 4 hours ago | parent [-]

Both articles cite a New Mexico case about the Unfair Practices act.

Though I don't see a link to a specific case in either article, I don't think they're separate cases.

ehl0 an hour ago | parent [-]

You're right, sorry. I thought I was referring to this article

https://www.nytimes.com/2026/03/25/technology/social-media-t...

on this post

https://news.ycombinator.com/item?id=47520505

awongh 4 hours ago | parent | prev | next [-]

As part of the ongoing enshittification of the internet, tragedy of the commons etc., these big centralized internet platforms decided that instead of being responsible and making their products *slightly* less terrible it was better to maximize short term engagement metrics, and that, egotistically, the chance of there being real consequences for their actions was near zero. (Or, even more cynically, that their yearly performance review was more important).

Now I'm afraid they've screwed everyone over and the idea of an anonymous open internet is now dead- we're gonna see age (read, real ID) verification gating on every site and app soon....

The dumb thing is to look back and see how unimportant it was for the Facebook feed algorithm to be this addictive. They already had the network effects and no real competitors. They could have just left it alone.

cogman10 4 hours ago | parent | next [-]

What's horribly frustrating with the age ID stuff is that the issue with Meta wasn't that they didn't know what they were doing, or that they were doing it to children. They knew. This wasn't a case of "if only they had the age, they could have done the right thing".

The laws being passed target exactly the thing that wasn't the problem. They should have been passing "duty of care" laws aimed at social media companies, not "give me your age" laws.

I may have missed it, but almost all these laws being passed for this issue have been pretty much solely around data collection rather than modifying the behavior of the worst businesses in the game.

It would be like seeing a car wreck kill a bunch of pedestrians and then passing a law that pedestrians need to carry IDs on them.

awongh 4 hours ago | parent [-]

Yea, in the end there will basically be no consequences for Meta- Facebook is already mostly dead, and the ad revenue from that time has already been collected.

Now we're just moving on to a moral-panic, think-of-the-kids kind of moment that is thinly veiled state surveillance.

basch 4 hours ago | parent | prev | next [-]

Watching Mark testify before the Senate, it honestly appears it may never have occurred to him that not offering a feature was an option. He treats the product as if it were some kind of inevitable outcome that was destined to exist.

cmoski 4 hours ago | parent [-]

It's not just avoiding any responsibility?

returnInfinity 4 hours ago | parent | prev | next [-]

Management comp is tied to numbers going up.

You start slow, then push it to the limits.

Netflix: never ads, to some ads, then eventually it's just Adflix, after 20 years.

Each new manager wants that comp up. So ads go up by 5% every year.

nclin_ 4 hours ago | parent | prev | next [-]

Mass surveillance 'for your own good' instead of regulating social media in any way.

You can purchase a scam ad and it'll be up in 10 minutes. Lie to every anxious child that they have ADHD and need meth, lie to every dejected boy that they just need to manosphere up and buy supplements.

They think the public is stupid. They might be right.

vharuck 3 hours ago | parent | prev [-]

>They already had the network effects and no real competitors.

Meta's biggest competitor was users' personal lives, not any other web service. They have been ruthless in crushing that competition.

HardwareLust 10 hours ago | parent | prev | next [-]

$375M isn't even a slap on the wrist for a company that raked in $60B last year.

Aboutplants 9 hours ago | parent | prev | next [-]

Why can't penalties be tied to a percentage of revenue?

vscode-rest 9 hours ago | parent | next [-]

You think if a mom-and-pop shop did the same they'd be charged the same?

Ylpertnodi 9 hours ago | parent | prev [-]

GDPR.

elAhmo 8 hours ago | parent | prev | next [-]

They earn this in around 16 hours.

0ckpuppet 11 hours ago | parent | prev | next [-]

The leaders of these companies don't let their kids use it.

mrweasel 10 hours ago | parent [-]

I doubt that Zuckerberg really uses either Facebook or Instagram all that much. Maybe as a curated PR channel sure, but he's not doom scrolling Instagram at bedtime.

If you know what the platform is capable of, if you've seen how the sausage is made, you're probably not using it.

People are also a little naive in not seeing that these platforms aren't just bad for children; they are bad for adults as well. I'm not opposed to not "selling" them to children, but we also need to label them correctly for adults and have rules like those for alcohol, tobacco and gambling, so no or limited advertising. Scrub the public spaces of Facebook logos.

c-flow 10 hours ago | parent | next [-]

I'm not sure it's naiveté; it's probably more that we are all complacent. If all Facebook/Instagram users (or perhaps even just those with children) stopped using them, that would be an actual stick, wouldn't it? But we don't (I'm not excluding myself).

vladms 9 hours ago | parent [-]

Deeper than that, it might be food for thought if someone can't stop doomscrolling. The platform doesn't matter; if people are "addicted" to "bad news", it might be the person at the corner of the street ("the end is nigh! repent!"), the pharmacy next block, or something else.

I personally stopped using Facebook because it was annoying me with useless doom and aggressive comments from people on stupid topics. If it had shown me only cat pictures (like Instagram does) or reasonable stuff (news, etc.) I would have continued using it.

kakacik 8 hours ago | parent | prev [-]

Discussions from proper experts about the absolute toxicity of social networks as implemented are at least... 15 years old at this point? At least that, and I am not talking about a rare article here and there but an onslaught of articles in popular media from all sides. But parents... mostly didn't give a fuck.

Let's admit it: in the same vein that Trump is a symptom of current US society, the approach and effects of the social networks we allow them to be are a result of how lazy and thus addicted people got. With many of the parents doing exactly the same on top of that, don't expect miracles.

One thing that I don't understand: even here, some folks call that sociopathic amoral piece of shit 'zuck' and treat his empire like some sort of semi-charity. When I attacked the Facebook company in the past, there was always a lot of defense (look at this open-sourced stuff, look at that... which I presume came from either direct employees or clueless stockholders). People are people, deeply flawed and often weak, without willingness to admit it to themselves.

deepsun a day ago | parent | prev | next [-]

I cheer any decision that holds a private web property (like Facebook) accountable for its users' actions.

It helps reduce the hegemony of large social platforms and promotes privately owned websites. For example, I know everyone who has permission to post on my website (or I pre-moderate strangers' comments), and I am ready to take responsibility for what my website publishes.

Currently the legal stance seems strange to me: large media platforms are allowed to store, distribute, rank and sell strangers' data, while at the same time they claim they are not responsible for it.

vel0city 21 hours ago | parent [-]

If you haven't already, you should look at the court case that prompted the creation of the current legal framework of Section 230. Prodigy was sued because of the things being said in public chatrooms. Should the host for an IRC server be responsible for everything said on the IRC server? Should they pre-moderate all the messages being said there? Should dang premoderate every post on this site?

https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prod....

ronsor 20 hours ago | parent [-]

The reality is that people who cheer for this stuff are going to be unreasonably shocked when it comes to bite them later. Once the government's done going after the big guys, the little guys are next, and unlike the big guys, they can't absorb a few fines and judgments.

elwebmaster 15 hours ago | parent | prev | next [-]

Can one be opposed to age verification in the OS and yet totally happy that Meta got this fine? There is a very big difference between e2e-encrypted telephony and social media. Social media is more akin to a phone book, and I do not recall there ever being any phone books listing minors. That's completely unacceptable and unnecessary. I am totally OK with phone books (or their modern digital equivalents, which enable people discovery and user-generated content discovery) abiding by the same KYC rules as banks, and being only for adults. Your kids using e2e-encrypted messaging to communicate with friends whom they have met in person? Nothing wrong with that; we all have the right to privacy. Kids listing their contact information publicly? Absolute no.

montroser 12 hours ago | parent | prev | next [-]

Cost of doing business...

sizero 11 hours ago | parent | next [-]

This. Meta made $60B in net income in 2025.

ryandrake 5 hours ago | parent | next [-]

Proportionally, it's as if an individual who makes $60K a year gets a speeding fine of $375. It might be moderately annoying, but it's not really going to be remembered in a month.
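A quick back-of-envelope check of that analogy (figures are the ones quoted in this thread, ~$375M penalty and ~$60B net income, not independently verified):

```python
# Sketch of the proportionality claim: scale Meta's penalty-to-profit ratio
# down to an individual earning $60K/year. All figures are thread assumptions.
penalty = 375e6          # jury award, from the article
meta_net_income = 60e9   # annual net income, from the thread

ratio = penalty / meta_net_income  # fraction of one year's profit
equivalent_fine = ratio * 60e3     # same fraction of a $60K salary

print(f"penalty is {ratio:.3%} of annual profit")   # 0.625%
print(f"equivalent fine for a $60K earner: ${equivalent_fine:.0f}")  # $375
```

So the $375 speeding-ticket comparison checks out arithmetically, under those assumed figures.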

BrtByte 8 hours ago | parent | prev | next [-]

If you can make 60B and occasionally pay a few hundred million in fines, the math kind of answers itself

lynndotpy 11 hours ago | parent | prev [-]

Has anyone in leadership at Meta faced even the prospect of jail time for what they've done over all these years?

bdangubic 10 hours ago | parent [-]

they will get congressional medals of honor sooner than that

eqvinox 11 hours ago | parent | prev [-]

"We went a little over the line to figure out where the line is, so, we can now guarantee you, dear shareholder, that we're extracting the absolute maximum possible value! Isn't that splendid!"

groundzeros2015 8 hours ago | parent [-]

More like “we found a company doing business in the EU who has deep pockets. I bet we can get 500 mil from them and they won’t leave.”

patrickmcnamara 8 hours ago | parent [-]

Who issued this fine?

throw7 8 hours ago | parent | prev | next [-]

If Meta did advertise the "safety of its platforms for young users" then they should be held accountable for that. It seems clear from the whistleblowers that Meta had internal data that they knew they were not safe for young users, but Zuck gotta get those ads($$$) in front of young kids.

2OEH8eoCRo0 6 hours ago | parent [-]

Modern cigarette companies

CobrastanJorji 5 hours ago | parent | prev | next [-]

"We remain confident in our record of protecting teens online," said the company that clearly was not punished enough to hurt their confidence.

WarcrimeActual 5 hours ago | parent | prev | next [-]

I haven't read this article, but I can tell you for certain that no verdict was handed down that will punish them in any way that matters. They have and generate more money than they could ever spend and they're functionally above the law because of the money and lawyers they can afford. The law itself is broken in this country and when you get big enough you can literally get away with murder.

bovermyer 4 hours ago | parent | next [-]

If history is any indication, only demonstrable threat of personal erasure will affect the behavior of people on this scale.

By "erasure," I'm not referring to the death of the involved; I'm referring to the elimination of the individual's social capital.

When the privileged lose their ability to influence others, they tend to get rather distressed.

johnnyanmac 4 hours ago | parent [-]

How would we do that here? Make Zuckerberg divest from FB or Meta as a whole? Would that be possible?

WarcrimeActual 4 hours ago | parent [-]

Honestly he was more right with the death part. The only thing these people really fear is death. Anything else is a fine and a fine means nothing when you don't feel it.

worik 3 hours ago | parent [-]

I am repeating myself, but prison would be a good deterrent

tikimcfee 5 hours ago | parent | prev | next [-]

+1. If there's a dollar amount attached to a verdict for a company of this size, then it's just a complicated business expense and not an enforcement of a law.

sharemywin 4 hours ago | parent | prev | next [-]

they should give voting stock out as punishment.

smuhakg 5 hours ago | parent | prev [-]

It's a $3 million verdict in compensatory damages. Even if reduced on appeal, that's a lot of money.

This is really bad for Meta.

dotancohen 5 hours ago | parent | next [-]

Meta has a net profit over $140 million _per day_. $3 million is absolutely nothing to them.

chimeracoder 5 hours ago | parent | prev | next [-]

> It's a $3 million verdict in compensatory damages. Even if reduced on appeal, that's a lot of money.

Where are you seeing that?

The article says:

> Jurors found there were thousands of violations, each counting separately toward a penalty of $375 million. That’s less than one-fifth of what prosecutors were seeking.

> Meta is valued at about $1.5 trillion and the company’s stock was up 5% in early after-hours trading following the verdict, a signal that shareholders were shrugging off the news.

> Juror Linda Payton, 38, said the jury reached a compromise on the estimated number of teenagers affected by Meta’s platforms, while opting for the maximum penalty per violation. With a maximum $5,000 penalty for each violation, she said she thought each child was worth the maximum amount.

john_strinlai 5 hours ago | parent | prev [-]

how many minutes of revenue is that?

they did $200 billion in revenue and $60 billion in net income last year.

a $3 billion fine would be barely more than a slap on the wrist.

danudey 4 hours ago | parent [-]

Until we start to penalize companies by a percentage of global revenue rather than some arbitrary dollar amount that pales in comparison to their revenues, this sort of stuff is going to keep happening.

$3m is nothing. 10% of global revenues (not profits) for each year in which this occurred would be something that might actually make them think twice about breaking the law and harming people for money.
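A minimal sketch of what that proposal would cost, using the $200B revenue figure quoted upthread (the number of violation years is a made-up assumption for illustration):

```python
# Hedged sketch of the parent's proposal: 10% of global revenue (not profit)
# per year in which the violation occurred. Revenue figure is from the thread.
annual_revenue = 200e9
rate = 0.10

def proposed_penalty(years_of_violation: int) -> float:
    """10% of global revenue for each year the violation occurred."""
    return rate * annual_revenue * years_of_violation

for years in (1, 3):
    print(f"{years} year(s): ${proposed_penalty(years):,.0f}")
```

Even one year under that scheme would be $20B, roughly 50x the actual verdict.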

thechao 4 hours ago | parent | next [-]

Once there's a pattern of abuse, you can go after the execs personally for purposes of carrying out justice. Courts don't like the idea of bad actors hiding themselves behind corporations. You don't even need to "pierce the veil" — you just go straight for the Zuck.

WarcrimeActual 4 hours ago | parent [-]

>you just go straight for the Zuck.

Will literally never happen. It's impossible. I'm not talking figuratively impossible. At his level of wealth and influence, there are good odds he could murder someone on live stream and walk away. You are dangerously underestimating the influence the rich have in every aspect of society and law.

kevin_thibedeau 4 hours ago | parent | prev [-]

C-levels need to face real consequences. A ban on moving to a new executive position or serving on a board for 10 years would rapidly fix the systemic ethical problems.

tremon 5 hours ago | parent | prev | next [-]

"told to pay"? As in, they're not even fined? What a horrible choice of headline.

maqnius 4 hours ago | parent | prev | next [-]

Tststs.. it's only allowed to harm adults and the environment for profit.

schubidubiduba 4 hours ago | parent [-]

Don't forget democracy

badpenny 9 hours ago | parent | prev | next [-]

0.6% of last year's profits.

JumpCrisscross 5 hours ago | parent [-]

> 0.6% of last year's profits

New Mexico is 0.6% of the U.S. population [1].

[1] https://en.wikipedia.org/wiki/New_Mexico 2.13mm

[2] https://www.census.gov/popclock/ 342mm
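Checking the coincidence above with the cited figures (all numbers assumed from the thread and its links, not re-verified):

```python
# Both "0.6%" figures, computed from the numbers quoted in this thread:
# $375M penalty vs ~$60B net income; NM pop 2.13M vs US pop 342M.
penalty_share_of_profit = 375e6 / 60e9
nm_share_of_us_pop = 2.13e6 / 342e6

print(f"penalty / profit:   {penalty_share_of_profit:.2%}")
print(f"NM / US population: {nm_share_of_us_pop:.2%}")
```

Both come out to roughly 0.62%, hence the parallel.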

mattfrommars 7 hours ago | parent | prev | next [-]

That’s good! We need to protect our children.

But who gets the $375 million dollars? Anyone know the cut the law firm will get from this incredible amount of money?

Alen_P 9 hours ago | parent | prev | next [-]

Most Facebook users are basically teenagers, so it's no wonder it took them this long to add any real restrictions...or maybe they just wanted us to think they cared.

1vuio0pswjnm7 2 hours ago | parent | prev | next [-]

Another item on the subject of this verdict that, at present, has more points is

https://news.ycombinator.com/item?id=47519625

One is a story by a journalist at CNN, the other is a story by a journalist at the LA Times

Multiple articles on the same topic can sometimes offer different facts and opinions, different perspectives

muskyFelon 9 hours ago | parent | prev | next [-]

Regulate and fine social media and adtech companies until it's no longer economically feasible to generate the massive profits and stock valuations that are prompting this garbage.

gotwaz 8 hours ago | parent | next [-]

Just have to read the quarterly conference calls between Zuck and Wall Street. Both groups are in total denial. And will be till we never hear from Zuck ever again.

matheusmoreira 8 hours ago | parent | prev [-]

Just break them all up via antitrust enforcement. It's increasingly becoming clear that society will degenerate into cyberpunk technofeudalism otherwise.

fridder 5 hours ago | parent | prev | next [-]

I wonder if this will stand and if it will lead to more suits against Meta.

nixass 11 hours ago | parent | prev | next [-]

Oh no those pesky Europeans extorting money from US tech companies. No, wait..

electric_muse 12 hours ago | parent | prev | next [-]

The same company intentionally driving minors towards this content (despite claiming to care about them) is also lobbying in secrecy for requiring all of us to scan our ID and face in order to use our phones and computers.

Their stated reason? Child safety.

Their actual reason? You can figure that out.

GuB-42 8 hours ago | parent | next [-]

The actual reason: child safety regulations

They don't care about child safety as long as it doesn't become so bad as to impact their revenue negatively. But they see that governments all over the world push for some kinds of age restrictions, and they know they are a prime target and it is hard for them to push back against that.

The reason they are (not so secretly) lobbying for requiring us to ID ourselves at the device level is that they don't want to be the gatekeepers. They want to make creating an account as effortless as possible, and having to prove your age is a barrier that may turn off some people, including adults, who may instead turn to services that don't require age verification. By moving age verification into the OS, not only does the responsibility shift to the OS or hardware vendor, but it also removes the disadvantage they have against services that don't require age verification.

For a similar issue, PornHub is currently blocked in France, because they don't want to comply with the law related to age verification. Here is their argument: https://www.aylo.com/newsroom/aylo-suspends-access-to-pornhu...

If you read between the lines, you will see that they have the same stance: "put age verification at the OS level, so that people don't discriminate against us". They know they are not in a position to argue against "child safety" laws, so instead, they lobby for making it worse for everyone instead of just themselves.

zerotolerance 7 hours ago | parent [-]

The other "real reason" is the solution will end up looking like a super cookie and enable machine-level tracking across every app.

forkerenok 11 hours ago | parent | prev | next [-]

Meta is like one giant cancer that grew a few small tumors of benign[1] nature, like some of their efforts in open source and open research (React, Llama, etc.).

[1]: I could be wrong thinking those are benign.

kryogen1c 9 hours ago | parent | next [-]

>Meta is like one giant cancer

Cancer is a great metaphor because it's a perversion of natural, healthy processes. So-called social media is nearly that, but actually grotesquely unhealthy.

People are dramatically unwell when they are not social, but that process, left unregulated, is also harmful — up to and including being lethal.

rolandog 8 hours ago | parent | next [-]

Exactly. It started out as something good: see what friends and family are up to. But now: scroll infinite algorithmically placed or sponsored rage bait trying to trigger you into behaving the way that advances certain corporate or foreign interests at the expense of whatever was left of our already tattered social fabric and our collective mental or literal health.

1over137 8 hours ago | parent [-]

> It started out as something good

No it didn’t. That was just like the first free sample from the drug dealer. Give a “good” free service to rope them in, always with the next steps in mind.

Quarrelsome 8 hours ago | parent | next [-]

I disagree. I feel like earlier social networks hadn't yet huffed the "lean startup" gas and weren't obsessed with engagement and thus were not yet trying to hook their users into an engagement cycle like where we are today.

I feel like the Myspace/Friendster and early Facebook were nowhere near as harmful (albeit for addiction, those sites were still vulnerable to grooming) as where we are today.

danny_codes 7 hours ago | parent | prev [-]

OG Facebook was perfectly fine. In your analogy it’d be more like someone replacing your Diet Coke with actual cocaine. Like, yeah Diet Coke isn’t great for you, but it’s not cocaine.

rel_ic 8 hours ago | parent | prev [-]

Being on "social media" is a fundamentally unsocial activity: you do it alone, it makes you lonely, and it separates you from others. Some people manage to bootstrap a social layer on top of the base medium, but most are being driven apart for profit.

I call it _anti_social media.

rdevilla 10 hours ago | parent | prev | next [-]

Facebook was the Eternal September of the Web. Netiquette died when it was made generally available, as did the culture that spawned it.

Aurornis 9 hours ago | parent | next [-]

I think you can tell approximately how old someone is by when they believe Eternal September started on the internet. Nobody believes it was when they started enjoying the internet. It was always when some other generation or service arrived after them.

The internet was not a calm and well behaved place before Facebook arrived. The original “Eternal September” was in the early 90s. Usenet, forums, Reddit, comment sections, and every other social part of the internet have been full of bad behavior long before Facebook came along.

ChrisMarshallNY 8 hours ago | parent | next [-]

Can confirm.

Source: I was a bad, bad, boi, on UseNet.

ghurtado 8 hours ago | parent | prev | next [-]

So many words and you missed the most important one: "netiquette"

That's the whole point: the word exists precisely as a testament to something that used to exist but now doesn't.

Anybody old enough to remember the word when it was common use should realize that it would have been impossible for the term to be coined in 2026.

If you missed that part of the Internet (maybe you were too young or maybe you were focused on other things, like the vast majority of people in the 90s), that's totally fine, but plenty of us did experience it and remember it pretty clearly.

> Usenet, forums, Reddit, comment sections, and every other social part of the internet have been full of bad behavior long before Facebook came along.

You can tell approximately how old someone is by whether they have reached the "everything sucks" part of life yet or not.

rdevilla 9 hours ago | parent | prev | next [-]

Hence... "of the web." IRC is and always was a cesspool but at least they had heard of netiquette, and it was something you could choose to partake in - or not, for the lulz. Nobody said anything about being "calm and well behaved" in particular.

plagiarist 8 hours ago | parent | prev [-]

Eternal September started before I was on the internet, but there have been several similar shifts since then.

It gets continually worse. Agentic AI is another Eternal September. For example, we now have dimwits sending dozens of unsolicited and unreviewed slop PRs to open source projects. Every search result is an affiliate marketing listicle obviously written by a robot.

h2zizzle 9 hours ago | parent | prev [-]

As a Millennial, I'm sad to say that it wasn't even older generations' fault, but our own (+Gen X). The tipping point was letting in normies who traded in photos and money instead of text and art.

rdevilla 9 hours ago | parent [-]

Elitism and selectivity were actually features of the early Internet. High barriers to entry (tech savvy, literacy) ensured that there was a high signal to noise ratio, and thus you had, let's say, upper quartile participants concentrated in one (forum of) fora.

LLMs are now heralding the Eternal September of even software engineering, and now I am wondering where to hang up my Techpriest robes in search of more elite pastures.

I wonder if this is how the clergy felt once the vulgar were allowed to study scripture not in the original spiritual programming languages of Hebrew or Latin, but English.

ghurtado 8 hours ago | parent | next [-]

> I wonder if this is how the clergy felt once the vulgar were...

You meant the "vulgus". "Vulgar" has the same root, but a very different meaning.

This random thought is kinda disconnected from actual human history. "Not allowed to study Scripture" was not a thing: illiteracy was. There were people who knew how to read and people who didn't, that's it.

I'm trying hard (and failing) to visualize your mental image.

"Dear Father: it looks like the Bible has been translated to English by my dear brothers up at the monastery. I'm sure you understand why I can no longer be a priest"

Remember that you're living in the actual earth timeline, not the 40k one.

h2zizzle 8 hours ago | parent | prev | next [-]

> Elitism and selectivity were actually features of the early Internet. High barriers to entry (tech savvy, literacy) ensured that there was a high signal to noise ratio, and thus you had, let's say, upper quartile participants concentrated in one (forum of) fora.

I disagree. I'm of the Neopets/Pokemon forums generation. Elitism and selectivity were not what made that era a good balance between the caustic free-for-all we have now and the rich kid's playground from before. It was the technical and practical restrictions on what you could put in and get out of a web experience.

You couldn't upload thousands of thirst traps every month, because storage was limited. You couldn't summon another head of the dropshipping or affiliate marketing hydras with a few clicks, because the infrastructure didn't exist. You couldn't inundate users with dark patterns designed to extract every ounce of attention, data, and cash possible, because the rich web wasn't that rich yet.

You had to deal in text and reasonably-sized images on a CRT with a limited-bandwidth pipe feeding it all. Because of this, many of the techniques developed to transform so many other forms of media and so many other institutions into Capitalist hellscapes and high school, respectively, didn't work online. Until they did.

foobarian 9 hours ago | parent | prev | next [-]

And Greek! Don't forget Greek

-emacs user

iugtmkbdfil834 8 hours ago | parent | prev | next [-]

I mean, one can always get an older machine and code everything as a holy binary chant, to not only impress the youngsters but also impose a level of distance from those 'limited by LLMs'.

FWIW, I like the analogy, despite seeing a benefit to knowing the original languages when studying scripture.

echelon 8 hours ago | parent | prev [-]

> I am wondering where to hang up my Techpriest robes in search of more elite pastures.

Capital and tech improvement will beat anyone chasing that.

mnw21cam 9 hours ago | parent | prev | next [-]

I think Zstandard would be the most benign example.

ozgrakkurt 9 hours ago | parent [-]

Zstandard was created by one amazing person. Pretty sure he would have done it even if meta didn't exist.

netfortius 10 hours ago | parent | prev | next [-]

A few weeks after they expanded access beyond .edu domains, I deleted my account. Haven't looked back since. Not an ounce of regret.

philipallstar 9 hours ago | parent [-]

Exactly. Why should furrin students get a look in?

SecretDreams 10 hours ago | parent | prev | next [-]

Everything consumer facing from meta is like a toxic waste hazard. It makes me sad seeing people stuck on those platforms.

tietjens 10 hours ago | parent | prev [-]

React benign? That’s the first time I’ve seen this suggestion on HN. Usually it’s held responsible for great crimes and wrongs.

muskyFelon 9 hours ago | parent [-]

Ha, I think the great crimes and wrongs title goes to Angular. I became a front-end guy specifically to avoid all the OOP verbosity. I'm just trying to call some APIs and render some data on a web page. I don't need layers of abstraction to do that.

Anyways, is there a "just use vue" effort like there is with postgres :)

tietjens 2 hours ago | parent [-]

I also found Angular to be a nightmare. I enjoy Astro, Svelte, even Preact can be fun. There are many to try. My comment above was just a joke, but I'm getting downvoted.

DivingForGold 9 hours ago | parent | prev | next [-]

Actually, Meta is spending millions to push the age verification requirement off to the app store providers, such as Google and Apple. It's an attempt to shield Meta from liability and transfer it to the app providers.

Ajedi32 9 hours ago | parent | next [-]

Having clear laws about what's allowed and what isn't is a lot cheaper than getting repeatedly sued for hundreds of millions for not doing things there was never a clear legal requirement to do.

miohtama 8 hours ago | parent | prev | next [-]

They are winning.

In the UK, you cannot use App Store and iPhone (your own phone) without verifying your identity:

https://x.com/WindsorDebs/status/2036727466597712008

Quarrelsome 8 hours ago | parent [-]

Google play store still works fine in the UK, so idk.

miohtama 7 hours ago | parent [-]

Just wait and see couple of months

Quarrelsome 6 hours ago | parent [-]

I'm not aware of any law that went through parliament that directly impacts installing apps. The OSA has already hit and didn't impact app stores. Can you link me the relevant legislation or Hansard debates?

simion314 9 hours ago | parent | prev [-]

>to push the age verification requirement off to the app store providers,

and it makes more sense: Apple and Google have your credit card, and if you are a parent who bought a phone for your child, then at first boot it should be your job as a parent to set up a child account.

jprjr_ 8 hours ago | parent | next [-]

> at first boot up as a parent should be your job to setup a child account.

Something I would be 100% OK with is some regulation that at first boot, you have to present information about what parental controls are available on the device and ask if you'd like them enabled.

I haven't set up a phone in a hot minute, I only do it once every few years, is this something they already do?

I'd imagine there's a lot of cases where a parent buys a new phone and hands down the old one to their kid without enabling safety features. I don't know if there's a good way to help with that - maybe something like, whenever you go to set a new password, prompt "hey is this for a kid?" and go through the safety features again?

Just spitballing, that last one may not be a good idea, not really sure.

simion314 5 hours ago | parent [-]

Exactly. I have not seen such a screen, but these giants have the budget to hire UX experts to design the initial setup to clearly ask whether this device is for a child, or whether it is for multiple users who need separate accounts. Also, to make the other commenter happy, they could ask whether you want adult content too and in that case set the same flags in the system.

Seems like such a simple solution, rather than each app and website having to figure out a way to do it.

inetknght 8 hours ago | parent | prev [-]

> Apple and Google have your credit card

They don't have mine.

Even if they did, having a credit card is not proof of age.

> if you are a parent that bought soem phone for you child then at first boot up as a parent should be your job to setup a child account

Setting up a "child account" shouldn't involve setting some age field. Setting up a "child account" should involve restricting permissions.

Why leave it to the OS or a company to decide what is "age appropriate"? Leave it to the parent to decide what the child should or should not have access to. Extra bonus: that same "child account" can then also be used for other restricted purposes. Want a guest account which limits activity? Want an incognito account? Want a sandbox account? None of these should require setting some age.

simion314 8 hours ago | parent [-]

This shit already happened years ago with consoles: I set up a child account, and the games and other features were restricted.

I am not paid by a trillion-dollar company to decide whether it should be a birthday input or a dropdown where you select your political and religious convictions about what your child should see. Sony figured it out; if Apple pays me, I will spend more time writing a UX flow for them so average people could set the accounts up, and the rest could ask their priest, cousins, or some other person who can follow instructions to set up the account for them.

The giants should have solved this decades ago instead of waiting for religious fanatics to push for this as laws and get the governments involved; now you will get 25 different laws about this.

mhitza 11 hours ago | parent | prev | next [-]

Of course it's for the protection of the children!

Why else would they want to sneakily add facial recognition to smart glasses?! /s https://www.businessinsider.com/meta-ray-ban-smart-glasses-f...

Akronymus 11 hours ago | parent | prev | next [-]

My guess: to discriminate whether traffic is from a human or a bot, to improve ad delivery metrics.

modo_mario 10 hours ago | parent | next [-]

Most sites are not going to implement this themselves. I think they're in prime position to become a key broker of identity in the same way that a lot of people already log in with their meta or google account to unrelated websites. They become very entrenched and get a ton of data that way.

As more and more people essentially lock themselves in with these identity brokers, though, I imagine it has a very stifling effect on speech. Imagine getting banned from those.

moolcool 10 hours ago | parent | prev [-]

Aren't they incentivized to treat bot impressions as real?

Manuel_D 8 hours ago | parent | next [-]

Not quite. If it's widely known that bot impressions aren't being filtered out, then people are less likely to place ads with Meta.

iamacyborg 9 hours ago | parent | prev [-]

Not if they can charge more for “certified” human impressions

giancarlostoro 9 hours ago | parent | prev | next [-]

I mean, their telemetry crap is in a lot of apps too. I remember someone DMing me something very niche on Discord; by chance I opened up Facebook, and it gave me ads for that very, very niche thing I had never even looked up on Google or Facebook. It was like IMMEDIATE. I opened up Facebook by chance, and voila.

The other one was the time I was speaking to my brother-in-law, who had just paved his driveway. He said "I could have used airport grade tar, but thought it was too much," and the only thing I can think of is that we were in front of his Nest security cam. But the very next morning, I'm scrolling through Facebook, and sure enough, someone local is advertising airport grade tar. Why? I didn't google this, I only heard it from them.

There's some serious shenanigans going on with ad companies, and we just seem to handwave it around.

Coincidentally, I remember both experiences very very vividly, because this was the last time I used either platform in any meaningful capacity.

alexfoo 9 hours ago | parent | next [-]

> The other one was the time I was speaking to my brother in law, who had just paved his driveway, he said "I could have used airport grade tar, but thought it was too much" and we were in front of his Nest security cam is the only thing I can think of, but the very next morning, I'm scrolling through Facebook, and sure enough, someone local is advertising airport grade tar. Why? I didn't google this, I only heard it from them.

Option A: The Nest camera not only listened to the conversation and picked out "Airport Grade Tar" and decided it needed to show adverts about it to people, but the camera also identified you to the point it could isolate your FB account in order to serve you those adverts.

(I'm making some assumptions but...)

Option B: Your brother had done various searches for airport grade tar from his home (in order to know how expensive it was). You, whilst visiting his home, were on his Wifi and therefore shared the same external IP address. Your phone did enough activity whilst at his house (FB app checked in to their servers in the background, or used Messenger, etc) to get the "thinking of buying airport grade tar" signal, associated with his external IP address, linked to your FB account while it was temporarily on that IP.

I had a friend who was convinced that some device in his house was listening in on his conversations with his wife as he kept on getting adverts for things they'd been talking about buying the day before but he hadn't searched for. (But she was searching for it from their home wifi, which is why it appeared in his adverts afterwards.)

hexaga 8 hours ago | parent [-]

Option C: no cameras or crude wifi tracing needed; they know who you talk to / associate with based on location data and the full profile of both sides, and can estimate things like 'will have mentioned X' -> can dispatch that via heuristic like 'show ads for X thing that was also mentioned by someone adjacent on that social graph'.

That is, BiL was marked as 'spreader for airport grade tar' based on recent activity, marked as having been in contact with spreadee, and then spreadee was marked as having received the spreading. P(conversion) high, so the ad is shown.

It's just contact tracing, it works well and is really easy even without literally watching what goes on in interactions.

giancarlostoro 3 hours ago | parent [-]

Funnily enough, I was looking up Tamagotchis weeks and weeks back, and my wife got an ad for them on Amazon.

GreenVulpine 9 hours ago | parent | prev [-]

No surprise there, Discord sells user data to Meta and X.

Permit 11 hours ago | parent | prev | next [-]

> Their actual reason? You can figure that out.

This is unfalsifiable. Just say what you think it is explicitly.

toss1 10 hours ago | parent | next [-]

Isn't this a conversation, not the publishing of scientific hypotheses, theories, and findings?

If so, it is customarily permissible to use rhetoric and sarcasm to more strongly emphasize a point. Or, to leave the conclusion as an exercise for the reader.

Permit 10 hours ago | parent [-]

By intentionally hiding their position (and simultaneously acting as though it is completely obvious) the OP shuts down any useful conversation that might follow. Do they think Meta will sell the user's data? Do they think different people are in charge of different policies at Meta leading to actions that appear to be in conflict with each other? Do they think they will use this information to train AI models? Do they think they will use this information to serve Ads?

There are many interesting ways that the conversation could have been carried forward, but there is no way to continue the conversation as the OP doesn't make it clear what they think.

The only thing I can say is: No I cannot figure it out, please tell me what you're trying to say here.

latexr 10 hours ago | parent | next [-]

> The only thing I can say is: No I cannot figure it out

On the contrary, looks like you can:

> (…) sell the user's data (…) use this information to train AI models (…) use this information to serve Ads

Permit 10 hours ago | parent [-]

What’s the point in providing a rebuttal to these points (e.g. that Meta doesn’t actually sell data to anyone) if the OP can simply say “that’s not what I meant”?

They are taking a position that cannot be argued against or even discussed because they don’t make that position clear.

thomastjeffery 9 hours ago | parent | next [-]

You are the only one arguing here. Not every conversation is an invitation to argument.

latexr 8 hours ago | parent | prev [-]

> providing a rebuttal to these points (e.g. that Meta doesn’t actually sell data to anyone)

So one of your suggestions of what the OP could mean was something you explicitly don’t think is true and would argue against? That sounds like a bad faith straw man set up.

Perhaps it’s just as well that the OP didn’t provide one specific reason to be nitpicked ad nauseam by an army of “well ackshually” missing the forest for the trees.

You could, as the HN guidelines suggest, argue in good faith and steel man. The distinction between “selling your data” and “profiting from your data” isn’t important for a high level discussion.

Can you truly not see through Meta’s intentions? There are entire published books, investigations, and whistleblowers to reference. Zuckerberg called people “dumb fucks” for trusting him with their data and has time and again proven to be a hypocrite who doesn’t care about anyone but himself.

olcay_ 9 hours ago | parent | prev | next [-]

I think they meant that Meta is offloading the cost (fines) of farming minors' data onto the operating systems. With an up-front cost of 2 billion dollars in lobbying, they can avoid paying $300M+ fines regularly.

toss1 7 hours ago | parent | prev [-]

Or, OP is not hiding their position and shutting down conversation — they are not imposing their position and are opening it up to discussion.

What prevents you from saying "Yes, and Xyz!!" and another poster "Yup, and Pdq, and Foo too!"

Or, maybe OP is just being a bit lazy, but again, it seems the context is conversation, not formal scientific inquiry where everything must be falsifiable?

functionmouse 10 hours ago | parent | prev [-]

Why defend Zuck??

mystraline 9 hours ago | parent [-]

Cause on a website fellating CEOs and capitalism, "CEO's Lives Matter".

ahoka 10 hours ago | parent | prev | next [-]

Easy: regulation always favors incumbents.

isodev 10 hours ago | parent [-]

Only as long as corps are allowed to lobby or introduce financial incentives into policy making

gadflyinyoureye 10 hours ago | parent [-]

So any day ending in y for the US Congress?

intrasight 10 hours ago | parent | prev | next [-]

I can't figure it out so please enlighten me.

jprjr_ 8 hours ago | parent [-]

Basically these age attestation/verification laws are being pushed as a "save the children!" scenario. But if you read the laws - all they really do is shift responsibility around.

Currently, websites and apps are supposed to ensure they don't have kids under 13, or if they do - that they have the parents' permission. That's federal law in the US.

These laws make the operating system or app store (depends on the particular law) responsible for being the age gate.

This doesn't stop the federal law from being enforced or anything, but the idea is that apps/websites don't handle it directly; that's handled by the operating system or app store.

So now - companies like Meta can throw up their hands and say "hey, the operating system told us they were of age, not our fault." It also makes some things murkier. Now if Meta gets sued, can they bring Google/Apple/Microsoft in as some kind of co-defendant?
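To make the responsibility shift concrete, here's a minimal sketch of what that flow looks like under these laws. Everything here is invented for illustration - `AgeSignal`, `fetch_os_age_signal`, and its fields are not any real platform API, just the shape of "the OS attests, the app trusts."

```python
# Hypothetical sketch: the app no longer verifies age itself; it just
# honors an age signal attested by the operating system or app store.
from dataclasses import dataclass

@dataclass
class AgeSignal:
    over_13: bool
    over_18: bool
    attested_by: str  # e.g. "os" or "app_store"

def fetch_os_age_signal() -> AgeSignal:
    # Stand-in for whatever platform call such a law would mandate.
    return AgeSignal(over_13=True, over_18=False, attested_by="os")

def can_show_adult_content(signal: AgeSignal) -> bool:
    # The app's only duty is to honor the attestation; if the signal is
    # wrong, the argument is that liability shifts to whoever attested it.
    return signal.over_18

signal = fetch_os_age_signal()
print(can_show_adult_content(signal))  # False for this sample signal
```

The point of the sketch is that the app's code path contains no verification logic at all - which is exactly the "not our fault" posture described above.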

I think that murkiness is the point. They don't need to create the most bullet-proof set of regulations that 100% absolves them of all responsibility, they just need to create enough to save some money next time they get sued.

I can think of a ton of regulations we could create to better help protect kids. We could mandate that mobile phones, upon first setup, tell the user about parental controls that are available on the device and ask if they'd like them enabled. Establish a baseline set of parental controls that phone manufacturers must implement and make available, like an approval process a device has to pass to hit store shelves.

We could create educational programs. Remember being in school and having anti-drug shit come through the school? It could be like that but about social media (and also not like that because it wouldn't just be "social media is bad," hopefully).

Again all these laws do is take what should be Meta's burden, and make it everybody else's burden.

intrasight 6 hours ago | parent [-]

Forget about the stated reason for the laws. The fact is that it makes sense to ensure that people using a service are age-appropriate. And there is no market mechanism (I mean tort law) because of Section 230.

Now the easiest law change - one that wouldn't require anyone to change anything - would be to revoke Section 230. This would make service providers liable. Everything else is a band-aid. I doubt that this verdict will survive appeal (due to Section 230). But if it does, then again there is no need for any new regulations. The tort lawyers will solve the problem for us.

If we do have device age verification, it still doesn't shield Meta. The lawyers will sue everyone involved, and discovery will show whether Meta had data indicating the user should have been blocked.

The purpose of age verification is to avoid all this. Of course the current proposals suck and won't achieve this. The market will not accept an approach that would work - which would be for anything with a screen or speaker to be permanently tied to an individual user. "OS verification" cannot succeed - it must be one-time hardware attestation. Even a factory reset wouldn't remove the user assignment.

rdevilla 10 hours ago | parent | prev | next [-]

Just remember that these capacities will never be used to exonerate - only crucify.

Aurornis 9 hours ago | parent | prev | next [-]

> is also lobbying in secrecy for requiring all of us to scan our ID and face in order to use our phones and computers.

You’re conflating different things. The OS-level age setting proposals are not the same as scanning IDs and faces.

I’m anti age check legislation, too, but the misinformation is getting so bad that it’s starting to weaken the counter-arguments.

> Their stated reason? Child safety.

> Their actual reason? You can figure that out.

We’re commenting under an article about one $375M lawsuit over child safety, with many more on the way. They are obviously being pressured on child safety by overzealous prosecutors. This is why they reversed course and removed end-to-end encryption from Instagram after it was brought up as a threat to child safety.

Also your “you can figure that out” implication doesn’t even make sense. The proposal to move age verification to the OS level would give Meta less information about the user, because the OS, not Meta apps, would be responsible for gating age content. I’m not agreeing with the proposal, but it’s easy to see that it would be more privacy-preserving than having to submit your ID to Meta.

dminik 8 hours ago | parent [-]

> The proposal to move age verification to the OS level would give Meta less information about the user, because the OS, not Meta apps, would be responsible for gating age content.

I find it hard to believe that Meta doesn't already have a pretty good age estimate for 95%+ of their users.

What offloading the responsibility to the app stores (or OS vendors) gives Meta is exactly that, offloading responsibility. In a future lawsuit, they can say that someone else provided them with incorrect information.

BrtByte 8 hours ago | parent | prev | next [-]

I get the frustration, but I think it's worth separating two things: failing at moderation vs pushing for stricter identity controls

1337biz 7 hours ago | parent | prev | next [-]

It is most likely not them; they are a proxy for the US. Under another administration they would use an NGO to advance the agenda. The goal is to face-scan the world.

noduerme 10 hours ago | parent | prev [-]

To be fair, they're just an evil corporation making lemonade out of lemons. I'm sure they'd be happier pushing porn and nazism to hundreds of millions of underage users, but if certain governments want them to write all that bunk code to verify everyone's ID, they might as well make money off the data.

philipallstar 9 hours ago | parent [-]

They're a lot more likely to push socialism than nazism. Hence all the socialism and the lack of nazism.

notnullorvoid 3 hours ago | parent | prev | next [-]

As usual the company is going to financially shield those responsible, while they in turn shield the company from societal blame.

girishso 7 hours ago | parent | prev | next [-]

Why do we call this company "Meta"? It's the same old "Facebook".

NERD_ALERT 7 hours ago | parent | next [-]

The name of the company is literally Meta. That’s why people call it that…

swiftcoder 6 hours ago | parent | prev [-]

Given that they just shuttered their "metaverse", I'm guessing we won't have to for much longer...

sayYayToLife 4 hours ago | parent | prev | next [-]

Does this mean Apple, Nintendo, and Disney are at risk too?

I would love to see some justice.

Cider9986 a day ago | parent | prev | next [-]

https://lite.cnn.com/2026/03/24/tech/meta-new-mexico-trial-j...

vpShane 7 hours ago | parent | prev | next [-]

Make the fine scale to fit the severity of the issue. This should be $375 billion, not $375 million. These are our future generations they're destroying.

SlightlyLeftPad 16 hours ago | parent | prev | next [-]

$375M - That’s it?!

rrgok 15 hours ago | parent [-]

It should be a couple of billion, or 15% of profit.

fragmede 18 hours ago | parent | prev | next [-]

So... end to end message encryption means meta can't see messages child molesters are sending to children.

lmm 18 hours ago | parent | next [-]

As it should. If they can read those messages they can read anyone's messages.

gostsamo 17 hours ago | parent | prev [-]

When you see traffic between a 40-year-old man and a 12-year-old girl who don't have any common social connections, and the messages are initiated by the man, you don't have to crack E2E to suspect dick pics.
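The pattern described above can be expressed as a metadata-only check, with no message content ever read. This is just a sketch of the idea; the thresholds and field names are invented for illustration, not any actual platform rule.

```python
# Metadata-only heuristic: flag for human review, never auto-punish.
# All thresholds here are illustrative assumptions.
def flag_for_review(sender_age: int, recipient_age: int,
                    mutual_connections: int,
                    initiated_by_sender: bool) -> bool:
    return (
        sender_age >= 18            # adult sender
        and recipient_age < 13      # child recipient
        and mutual_connections == 0 # no shared social graph
        and initiated_by_sender     # adult made first contact
    )

print(flag_for_review(40, 12, 0, True))   # True: matches the described pattern
print(flag_for_review(40, 12, 5, True))   # False: shared connections exist
```

Note this only works with the metadata the platform already holds (ages, graph, who messaged first), which is the commenter's point: E2E content encryption and this kind of signal are not mutually exclusive.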

vaylian 15 hours ago | parent | next [-]

> which don't have any common social connections

How would you actually know this? Facebook is a surveillance company, but they are not omniscient.

b112 17 hours ago | parent | prev [-]

So you want the platform to be creepier and investigate connections more intensely? And you want to intercede on an arbitrary method you just made up, without examining all traffic first?

I seem to recall someone taking pictures of their baby, naked, because it was sick, and emailing them to the doctor -- and having their Apple account terminated. Terminated, with the father being labeled a pedophile, and the police contacted (all automatically).

Everyone was quite upset. Everyone felt it was too intrusive.

Frankly, communication platforms have no business trying to police anything at all. I wouldn't want the phone company recording all my conversations, hunting for trigger words, and then contacting the police or cutting off my phone if I said "bad word".

Yet somehow it's OK to have this level of intrusion because.. um "computers".

The state has no business listening in on private citizen's communication.

Corporations have no business doing so.

To protect the 12 year girl, something called "her parents" need to pay attention and watch what she does. That's their job. They're her guardian.

Some random corporation has no business in that. Some random corporation has no business being an 'algorithmic parent', an automated machine with no appeal.

Here's something I'd support -- a way for parents to prevent children from registering for accounts, and, to be able to examine children's accounts.

But... then we get into ID verification. Of course, surely you support ID verification for platforms, because if you support platforms knowing the age of people (40 and 12, you listed), then you therefore must support a way to verify those ages.

bluegatty 17 hours ago | parent | next [-]

" And you want to intercede on an arbitrary method you just made up,"

No, they literally identified a plausibly sensible policy flag, not some arbitrary action.

These flags are used in literally every system imaginable.

That they don't conform to some hard criteria, to your criteria, or to some working or ideological group's criteria, is a bit beside the point.

Every system has these for good reason.

We have laws and regulations for all sorts of things to help people - including children and parents - in a complex society.

"The state has no business listening in on private citizen's communication."

They absolutely do, depending on circumstances. While Facebook is not a place for state monitoring, it's definitely in the public interest if they flag something that is 'very bad' by some reasonable criteria, so that the state can then act if necessary. They do so within the boundaries of the law, subject to judicial oversight.

Facebook is a popular social network, a place where they want people to feel eminently safe. It's a Starbucks lounge without coffee - not a 'personal hyper-protected zone'.

Other places, such as Signal, Telegram etc. can have different levels of privacy aka e2e given the different offering and expectations of privacy.

Facebook more or less wants to offer a relatively safe place where the kids can hang out, where they know crazy people are not going to attack their kids. It's a community centre, not a hacker zone.

If we can get past that, then we can move onto basic issues of privacy, advertising etc. which are damaging to everyone, especially young people, for which Facebook has perverse incentives.

b112 16 hours ago | parent [-]

"The state has no business listening in on private citizen's communication."

They absolutely do, depending on circumstances.

So primary is this concept of privacy, that it requires an entire legal framework, evidence of potential wrongdoing, proof that there is no other method to achieve the goal of validating guilt, proof that the crime is severe and this is not a fishing expedition, approval via a warrant after a judge has examined that evidence, and strict controls around the entire usage of that warrant.

Wikipedia says:

Lawful interception is officially strictly controlled in many countries to safeguard privacy; this is the case in all liberal democracies.

Using this edge case as "depending on circumstances" is clearly not the general case I was referencing. The statement that

"The state has no business listening in on private citizen's communication."

Is valid, correct, and accurate. Listing edge cases does not invalidate the rule; it is the exception to the rule, and considering the sheer volume of communication compared to the volume actively tapped through legal means, it is the most edge case of edge cases.

There is no reason I would deem it OK for a mega-corp to do what I would demand the state not do - what our democratic societies have deemed our states should not.

To highlight that, the phone companies of old would be in infinitely hot water, should they listen to communication between customers, in any fashion.

A platform is not a parent, should not police, should not act as an arm of the state, or as an arm of parents, except as I stipulated, by direct request of the parents, and only to enable the parents to be a guardian. Under no circumstances should that involve the platform scanning anything, instead, the platform could simply give parents direct access to a child's account.

bluegatty 16 hours ago | parent | next [-]

" that it requires an entire legal framework, evidence of potential wrongdoing, proof that there is no other method to achieve the goal of validating guilt"

No it doesn't.

Life is no Reddit, lawyers and technicalities.

It's made up of regular people in communities.

If you see some guy creeping on 10 year-olds, you can notify the police and Facebook will do that as well - for the same reason.

It may not at all need to involve 'state surveillance', and Meta can probably hand over whatever they want to the police in that circumstance.

The police can make a decision as to how to proceed.

A bit like if someone was harassing someone on the street.

Or if an unknown person starts hanging out outside by a schoolyard in a way that seems inappropriate.

We don't want to transgress people's rights but we also are going to look at 'negative signals'.

TheDong 15 hours ago | parent | next [-]

With e2e encryption, the signals you have are pretty minimal.

Let's say a 40 y/o man finds a phone on the ground, sees a name stuck on it, googles "name + town" and finds the facebook of a 12 y/o girl, and messages "Hey I found this phone, do you recognize it? <photo>"

With e2e encryption, you can't easily tell the difference between that and a creep.

This thread is advocating that exactly that case should result in a police visit with the assumption of guilt.

b112 8 hours ago | parent | prev [-]

You've quoted out of context, eliminating:

"The state has no business listening in on private citizen's communication."

So yes, the concept of privacy is so primary that it requires an entire legal framework for the state to listen in.

--

In terms of the rest of your post, even though you quoted out of context, what you're saying is fine. But the people noticing things on the street have nothing to do with those who maintain the roads. You really don't want corporations to have algorithms that make them report trigger words to the police or state.

Instead, as I said, empower the parents. Legal guardians. It's their job to watch.

bluegatty 6 hours ago | parent [-]

" You really don't want corporations to have algorithms which mean they have to report trigger words to the police or state"

They already do.

The entire financial system, all of social media, and many organizations past a certain size.

I did not quote out of context - the commenter was misattributing context.

b112 37 minutes ago | parent [-]

I absolutely did not misattribute my own context, whatever that action means.

And some things are reported, others are not; the point being, yes, E2E isn't reported, for obvious reasons. Loads of stuff isn't reported on social media; in fact, that's the absurd complaint against Meta!

And regardless of what is done now, that doesn't mean we want it. I didn't say it is or isn't done, I said "You really don't" want that. The more encroachment in that realm, the less free a people are.

gostsamo 16 hours ago | parent | prev [-]

You build the strawman to destroy it. We are not talking about the state; we are talking about a social network that advertises itself as safe for children, absolutely has metadata for approximate age and social connections, lets one identify as a minor deserving protections, and prefers to increase engagement at *any* cost to its users.

Barbing 16 hours ago | parent | prev | next [-]

>~~Apple~~

NYT: “A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.” [2022]

as linked- https://news.ycombinator.com/item?id=45447606

gostsamo 17 hours ago | parent | prev [-]

So nice of you to know what I want without me even implying it, using examples that are contrary to the conversation, making arguments that you've invented, for purposes that only you can imagine.

But just imagine that kids' accounts come with restrictions and privileges: when one account is marked as such, accounts marked as adult cannot initiate contact, the kids' data is automatically private, and those accounts cannot be commercialized in any shape or form.
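That policy reduces to a single rule. Here's a minimal sketch, assuming hypothetical per-account adult/child flags (which account marking works is exactly what the rest of this thread disputes):

```python
# Illustrative contact-initiation policy for marked child accounts.
def may_initiate_contact(sender_is_adult: bool,
                         recipient_is_child: bool) -> bool:
    # Adults cannot open a conversation with a child-marked account;
    # every other combination is allowed by this rule.
    return not (sender_is_adult and recipient_is_child)

print(may_initiate_contact(True, True))    # False: adult -> child blocked
print(may_initiate_contact(False, True))   # True: child -> child allowed
```

The rule itself is trivial; the hard part, as the replies point out, is trusting the flags it consumes.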

b112 16 hours ago | parent [-]

So nice of you to know what I want without me even implying it

Yet you did imply it, as I said, by mentioning the age of the persons involved.

There is no accurate way to know age, without some form of identity or age verification. Presuming a child will have an account marked "child" is folly, for kids can just sign up without a parent's knowledge, creating a second account. If the goal is to actually protect and be a pseudo parent for the child, then actually ensuring that a child cannot have an adult account is part of that.

My point is, TSA style "we're doing things which look secure, but are not helpful and only inconvenient" isn't going to help. It will only give the appearance, not the actualized result of security.

gostsamo 16 hours ago | parent [-]

No, the child can mark itself as a child. The implication is in your imagination. There are lots of children (even most) who do not feel the need to fake an older age on the internet, but who currently get only the downsides of children's accounts and get exploited by the platform.

nomel 15 hours ago | parent [-]

> No, the child can mark itself as a child

and, an adult can mark itself as a child.

cwmoore 12 hours ago | parent | prev | next [-]

Seems insufficient to keep Social Security solvent after 2040.

Are the kids alright?

groundzeros2015 8 hours ago | parent | prev | next [-]

Lots of negative meta sentiment the past few months. Feeling a bit like 2021 and wondering if it’s time to buy?

paxys a day ago | parent | prev | next [-]

Happy to see it, but if a fine is the only consequence then they’re going to go back to doing the exact same thing tomorrow.

bergheim 5 hours ago | parent | prev | next [-]

What is so fucked up about this is that THEIR WHOLE RAISON D'ÊTRE is knowing more about you than you do.

You think they need this to know your age? Your gender? Your home, your birthplace, your political stance?

randycupertino 17 hours ago | parent | prev | next [-]

This is one of the first times a court has found the platform itself liable, rejecting the industry's frequent claim that platforms merely host content and are never responsible for it. $375 million sounds big but is peanuts compared to their annual revenue. And of course Meta will appeal and try to drag everything out for years and years. Expect copycat lawsuits.

These platforms expose minors to predators and bad actors, and Meta was shown to have lied about safety.

WillPostForFood 17 hours ago | parent | next [-]

The state has a solution - force age verification for everyone on the platforms.

> The state will ask Biedscheid to direct Meta to make changes to its platforms, including adding effective age verification

lostlogin 17 hours ago | parent | prev [-]

After you’ve been complicit in genocide, lesser charges are just not that shocking.

They immunised us.

NooneAtAll3 7 hours ago | parent | prev | next [-]

can someone explain how the fine size is calculated?

whamlastxmas 7 hours ago | parent [-]

Head of a chicken is cut off over a giant dart board, and wherever its headless body lands determines the fine

t1234s 10 hours ago | parent | prev | next [-]

Who is getting paid the $375m?

bilekas 9 hours ago | parent [-]

The state of New Mexico presumably as they brought the suit.

Ylpertnodi 9 hours ago | parent [-]

...so it's not only the EU that does this kind of thing.

kstrauser 8 hours ago | parent [-]

Sues companies breaking the law? I’m glad we still do some of that here.

andrewstuart 11 hours ago | parent | prev | next [-]

Age verification isn’t misleading is it?

csense 8 hours ago | parent | prev | next [-]

We used to believe in freedom of speech and freedom of association.

Since the dawn of the Internet era, we've had a legal principle that platforms are relatively shielded from liability for what their users do.

It's the Internet. There's sexual content and sketchy characters on it. Occasionally people will encounter them -- even if they're under 18.

Anyone who grew up in the mid-1990s or later, think back to your own Internet usage when you were under 18. You probably found something NSFW or NSFL, dealt with it, and came out basically OK after applying your common sense. Maybe it was shocking and mildly traumatizing -- but having negative experiences is how we grow. Part of growing up is honing one's sense of "that link is staying blue" or "I'm not comfortable with this, it's time to GTFO". And it seems a lot safer to encounter the sketchy side of humanity from the other side of a screen.

Think about how a young person's exposure to the underbelly of humanity might have gone in pre-Internet times: get invited to a party, find out it's in the bad part of town with a bunch of sketchy people there -- and now you're exposed to all kinds of physical risks. You can't leave the party as easily as you can put your phone down.

I stopped logging onto Facebook regularly around 2009; I only log in a couple times a year. I hate what Facebook has become in the past decade and a half.

But giving a site with millions of users a multi-hundred-million-dollar fine because some of those users behave badly seems...asinine.

If your kid is old enough and responsible enough to be given unsupervised Internet access, you'd better teach them how to deal with the skeevy stuff they might encounter.

danny_codes 7 hours ago | parent | next [-]

That’s not really true. Pre-internet we had relatively much stricter content controls. Fairness doctrine springs to mind, plus significant regulation of the movie industry.

Letting companies sell addiction has pretty significant negative externalities. That’s why we regulate gambling and drugs. Facebook sells addiction, so it makes sense to regulate it like we do drugs and gambling.

BrtByte 8 hours ago | parent | prev | next [-]

I think the difference is scale and targeting

Dotnaught 8 hours ago | parent | prev [-]

>we've had a legal principle that platforms are relatively shielded from liability for what their users do.

...when they've made a good faith effort to address harms.

dangus 7 hours ago | parent | prev | next [-]

This is less than 4 days of profit.

xvxvx 3 hours ago | parent | prev | next [-]

Until the fines are large enough to impact business and cause heads to roll, and maybe we even see some prison time for executives, companies will continue to not give a fuck. This is chump change for Meta.

jazzpush2 4 hours ago | parent | prev | next [-]

Name and shame the managers and leadership at this time. I dream of a world where they'd be recognized and shamed in the streets for all the damage they've done to society. Instead they get to do all kinds of side quests with their money.

notnullorvoid 4 hours ago | parent | next [-]

I'd much rather they get personally fined and/or banned from holding leadership positions in the field (with varying timeframes depending on the level of responsibility).

Naming and shaming won't do much good. It could backfire and serve as a positive mark on their resume for other morally corrupt leaders.

worik 3 hours ago | parent [-]

Short prison sentences would be a better deterrent for white collar crime than fines.

notnullorvoid 3 hours ago | parent [-]

I think it depends on the level of responsibility. If orders came from above, then sure, put those most directly responsible for the order in prison. I also think the lower-level leaders should be held accountable for relaying the orders, which is where a "can no longer hold this position of authority for ~2 years" punishment might be appropriate.

forgetfreeman 4 hours ago | parent | prev [-]

meh. hit the C suite and the board with life-altering punitive damages.

zombot 8 hours ago | parent | prev | next [-]

Still just a drop in the bucket compared to their quarterly profits. When will regulators get wise?

Beefin 8 hours ago | parent | prev | next [-]

This is a good flag that you should be rolling your own safety checks. It's not hard; here's a writeup of an ancillary problem/solution: https://mixpeek.com/blog/ip-safety-pre-publication-clearance

m3kw9 9 hours ago | parent | prev | next [-]

Calculated risk cost by them

intended 11 hours ago | parent | prev | next [-]

This particular verdict is a long time coming. How it drives meaningful change is the bigger question.

One of the challenges we need to resolve is the race to the bottom for online communities - engagement metrics will always produce a pH level that favors more acerbic behavior.

There are multiple analyses you can find, if not your own experience, to support the belief that we should be able to do better with our information commons.

Just today, I found a paper that studied a corpus of Twitter discussions and found that bad-faith interactions constituted 68.3% of all replies.

The engineer and analyst side of us will always question these types of analyses.

I’ve read enough papers at this point for the methods to matter more than the conclusion.

1) Meta and the other tech platforms need to open up their research and data. NDAs and business incentives prevent us from having the boring technical conversations.

2) tech needs someone else to be the bogeyman - the way we did for tobacco. The profit incentive ensures profitable predatory features pass review. Expecting firms to ignore quarterly shareholder reviews for warm fuzzies is … setting ourselves up for failure.

Regulators (with teeth) need to be propped up so that the right amount of predictable friction (liability) is introduced.

3) tech firms need an opportunity or forum to come clean. The sheer gap between the practical reality of something like content moderation vs the ignorance of users and regulators - results in surprise and outrage when people find out how the sausage is made.

4) Algorithm defaults decide the median experience for participants in our shared marketplace of ideas. The defaults need to be set in a manner that works for humans and society (whatever that might be).

Economies are systems to align incentives to achieve subjective goals.

NickC25 5 hours ago | parent | prev | next [-]

1. This fine is 1/100th the size it should be. Make them pay, and break up Meta/Facebook.

2. The age verification pushes coming from several different actors across gov't and the private sector are worrying. I trust no actor here, and neither should you.

3. Zuck should be in jail.

luxuryballs 9 hours ago | parent | prev | next [-]

and who gets that money ^^

franrai a day ago | parent | prev | next [-]

What about X?

fuzzfactor 8 hours ago | parent | prev | next [-]

I don't know who they have to pay it to, but that's only for New Mexico, which has about two million people, so it works out to about $187.50 per person.

That's pretty cheap when it comes to deception.

The eyes of Texas should be upon this; Texas is 15X the size and should not settle for less than $1,000 per person, since deceptive trade practice is taken much more seriously there than in other places.

Now that would set a $30 billion example which may not be enough of a deterrent either.
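A quick sanity check of the arithmetic in the figures above (population counts are the comment's round numbers, not census data):

```python
# Per-person value of the New Mexico award, using the comment's figures.
nm_award = 375_000_000   # $375M verdict
nm_pop = 2_000_000       # "about two million people"
print(nm_award / nm_pop)  # 187.5 dollars per person

# The hypothetical Texas example: 15x the population at $1,000 each.
tx_pop = 15 * nm_pop
print(tx_pop * 1_000)     # 30_000_000_000, i.e. the $30 billion example
```

Both round numbers in the comment check out.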

But there are probably plenty of people for whom a $5,000 one-time payment might not come close to being fair compensation for what's already happened; with Meta allowed to continue as a going concern, that's got to be psychologically harmful.

To really fix it each state would have to follow "suit" while greatly upping the ante so there's at least hundreds of billions at stake.

Meta can afford it, and who else is responsible for so much widespread, sneaky deception at this scale for so long?

NickC25 5 hours ago | parent [-]

>Now that would set a $30 billion example which may not be enough of a deterrent either.

Mark's personally worth more than 10x that, Facebook's got a 1.7 trillion market cap, so it really wouldn't move the needle for them. Cost of doing business and whatnot.

quux 10 hours ago | parent | prev | next [-]

“Pay them, in the scheme of things it’s a speeding ticket”

salawat a day ago | parent | prev | next [-]

So... Question. Seeing as Zuck is the majority voting shareholder and highest ranked executive, why isn't there a piercing of the corporate veil going on? This isn't some distributed blame case. Ultimately, his decision making led to what the jury finds objectionable. I find it absurd that somehow, the corporate veil is able to absorb even this? Somebody accepted the risk. That somebody is at the top of the pyramid. Want to send a message? Get 'em.

anthk 10 hours ago | parent | prev | next [-]

Now sue them for lobbying against GNU/Linux with CSA, their front lobby.

johnea a day ago | parent | prev | next [-]

Another poster child for Meta's lobbying (bribery) to encourage OS level age verification. (numerous recent references in HN posts)

They very much want to push this liability off onto someone else...

As far as end-to-end encryption, on SM sites (social media or SadoMasochism, however you want to read it) I don't really see the need.

Aurornis a day ago | parent | next [-]

> As far as end-to-end encryption, on SM sites (social media or SadoMasochism, however you want to read it) I don't really see the need.

You don't see any benefit to allowing people to encrypt their private communications in a way that can't be accessed by the company?

It's weird to see tech news commenters swing from being pro-privacy to anti-privacy when the topic of social media sites comes up.

gzread a day ago | parent [-]

Meta has a way to read your E2EE messages. I don't know what it is, but if they didn't have one, they wouldn't offer it.

There's a difference between E2EE between friends who want to remain secure, and E2EE between strangers in an attempt for the platform to avoid legal liability for spam.

thorncorona 18 hours ago | parent [-]

For an account 41 days old, you participate in a lot of controversial threads.

gzread 6 hours ago | parent [-]

Please see the posting guidelines.

tzs a day ago | parent | prev | next [-]

> Another poster child for Meta's lobbying (bribery) to encourage OS level age verification. (numerous recent references in HN posts)

The references I saw showed Meta had lobbied for some of the laws that require age verification be done by the site or by third party ID services. They did not show that Meta lobbied for any of the OS bills.

Some showed that Meta had lobbied in some of the states with those bills, but they just showed Meta's total lobbying budget for those states.

kstrauser a day ago | parent | prev [-]

You were downvoted, but right. Meta wants to be able to say, "hey, the OS said she was 18!" and not get in trouble for it.

Online child exploitation should be a strict liability offense.

idle_zealot a day ago | parent [-]

How does this apply to, say, Signal?

gzread a day ago | parent [-]

That's why Signal requires a phone number. You can't talk to people you don't know because complete strangers don't give you their phone number. And if you do spam random numbers, they'll report you to the police and you can be tracked down based on your identifier, which still doesn't leak the chats between you and people you actually know.

gnabgib a day ago | parent [-]

No.. it doesn't. https://freedom.press/digisec/blog/signal-identifiers/

cynicalsecurity 10 hours ago | parent | prev | next [-]

As much as everyone hates Meta for selling people's personal data, this is absolutely ridiculous. The hysteria around forcing companies to do parents' jobs doesn't make any sense whatsoever.

danny_codes 7 hours ago | parent | next [-]

By this logic tobacco companies did nothing wrong when they pretended like smoking didn’t cause cancer for decades. Misleading users is harm.

bilekas 9 hours ago | parent | prev | next [-]

Requiring ID to browse the internet is doing the parents' job of managing what their kids are doing online.

Stopping misleading advertisements and mental health harms while claiming to be protecting children is not on the parents. The parents were given false information and led to believe their kids would be safe.

cynicalsecurity 5 hours ago | parent [-]

I've never seen Meta advertise themselves as a kindergarten or a playground for kids. They have always been perceived as a public square or forum. It's wild to leave your child alone in a public place and expect safety.

tartoran 10 hours ago | parent | prev [-]

Oh please! It’s not about parenting, it’s a cancer on society and now affecting the youngest and also the seniors.

RagnarD 11 hours ago | parent | prev | next [-]

Drop in the bucket for them. Giving Zuck some jail time would send a more appropriate message: there's no doubt he knows about and approves of the kind of evil activity New Mexico law enforcement dug up.

deepvibrations 10 hours ago | parent [-]

That would be a dream, but I can't see it happening. But I totally agree with your point: platforms should face genuine legal exposure for algorithmic harm to minors (as tobacco companies did for health harm).

Unfortunately, as we found out recently, Meta's lobbyists are a powerful force to contend with and I do not trust our governments to stand up to them.

shevy-java 10 hours ago | parent | prev | next [-]

Meta should be disbanded for the damage it has caused to mankind. Age verification tainting Linux is also heavily attributable to Meta buying legislation; systemd already went down that path quickly, in order to appease its corporate gods. Private user data gets released to random actors willy-nilly, amid the constant reassurance that "no, this is not what is happening." Until it suddenly is happening, precisely as people predicted. Everyone runs a meta-agenda nowadays, Meta more than most.

2OEH8eoCRo0 10 hours ago | parent | prev | next [-]

Repeal section 230

kyledrake 7 hours ago | parent | next [-]

Careful what you wish for https://www.techdirt.com/2020/06/23/hello-youve-been-referre...

kstrauser 8 hours ago | parent | prev [-]

Why do you dislike the Internet?

2OEH8eoCRo0 8 hours ago | parent [-]

I love the internet. I hate what a lack of liability for platforms has done to the internet.

kgwxd 9 hours ago | parent | prev | next [-]

Shareholders: Worth it!

androiddrew 11 hours ago | parent | prev | next [-]

Alternative headline: household spyware cash machine forced to pay $20 for being bad.

If you want to punish Meta then you have to punish the wonder boy who runs it. Not even shareholders can fight off the guy spending $80B on the metaverse.

jazz9k 4 hours ago | parent | prev | next [-]

lol. And you think we will ever legalize drugs (and let people take responsibility), when large companies are being sued over people getting addicted to social media?

superxpro12 4 hours ago | parent | next [-]

There's a vast difference between accurately advertising the effects of drugs and the risks involved in taking them, versus lying to you about the drugs and creating an environment that furthers addiction.

It all boils down to consent.

I might want to take some drugs that have some harmful side effects. But I knew about them and I willingly made the choice because I valued the high more.

Contrast this with: I knew about the harmful side effects, told you they didn't exist, and said you should take more. And then I changed the drug so it's even MORE harmful, because it also makes you BUY more. That's what these social media sites do.

They use engineered sociology and psychology to create addictive products, and then refine them to maximize profit at the cost of anything they can pull a lever on.

What bothers me the most is not the vampires at the top sucking out every dollar they can extract out of vulnerable people, but the fact that so many engineers are supporting this. So much for engineering ethics. Why even bother teaching it anymore?

mlyle 4 hours ago | parent | prev [-]

If you take actions to deliberately weaponize your product against children in particular, whatever it is -- you shouldn't be surprised when liability attaches. That's what this verdict is about.

josefritzishere 6 hours ago | parent | prev | next [-]

Meta can do more and should do more. I think that's the short of it. The company made $59 billion last year. It's completely reasonable to expect that they expend effort and budget on reducing their harm to children.

kevincloudsec 6 hours ago | parent | prev | next [-]

The fine is 0.6% of last year's profit. The lobbying budget probably costs more.

rimbo789 9 hours ago | parent | prev | next [-]

That penalty is about a couple of orders of magnitude too small.

cs702 9 hours ago | parent | prev | next [-]

That's peanuts.[a]

[a] https://dictionary.cambridge.org/us/dictionary/english/peanu...

slazien 3 hours ago | parent | prev [-]

Why do we have prison sentences for insider trading, which is arguably (much) less harmful to the society, and not for this?

rishabhaiover 3 hours ago | parent | next [-]

because the damage done is relatively objective?

slazien 3 hours ago | parent [-]

Is that the only factor? Is insider trading objective? (Hint: it's not, read the law.) Is it objective only when we can attribute a quantitative measure to it? What's the relative "value" of $1M profit from insider trading vs a single child's destroyed psyche? How much value could that child have contributed to society had it not been for the harm done to them? Is there really much subjectiveness in terms of the harm done to those kids?

All that to say: I don't think "objectivity" should be the (main) factor resulting in existence of adequate punishment.

fnord77 3 hours ago | parent | prev [-]

Insider trading is incredibly toxic to society. It is not a victimless crime. It is tantamount to stealing.

slazien 3 hours ago | parent [-]

It is, I agree. My point is that the proportionality of consequences is not there. We seem to be good at criminalizing discrete, individual financial acts, but not systemic corporate decisions that cause diffuse harm. That's even when the aggregate harm is arguably far greater.