Landmark L.A. jury verdict finds Instagram, YouTube were designed to addict kids(latimes.com)
283 points by 1vuio0pswjnm7 5 hours ago | 194 comments
bogdanoff_2 3 hours ago | parent | next [-]

The solution to this would be a law forcing these sites to allow third-party suggestion algorithms, so that you can choose who and how content is being suggested to you.

Perhaps it could be as simple as allowing third-party websites and apps for watching YouTube on your phone. And it's okay if this were a premium paid feature, so there's no counter-argument that "it costs them money to host videos".

This is not an entirely new idea, either. Before Spotify became popular, people would integrate Last.FM into their media players to get music recommendations based on their listening history, and you could listen to music via YouTube directly on the last.fm website.
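To make the idea concrete, here is a minimal sketch of what a pluggable recommender interface could look like. Every name here is hypothetical; no platform exposes this API today, which is exactly the point:

```python
from dataclasses import dataclass, field


@dataclass
class Video:
    id: str
    title: str
    tags: set = field(default_factory=set)


class TagOverlapRecommender:
    """One possible third-party algorithm: rank candidates by how
    much they overlap with what the user already chose to watch."""

    def rank(self, history: list, candidates: list) -> list:
        watched = {tag for video in history for tag in video.tags}
        return sorted(candidates,
                      key=lambda v: len(v.tags & watched),
                      reverse=True)


def build_feed(recommender, history, candidates, limit=10):
    # The platform supplies the raw data; the user picks the recommender.
    return recommender.rank(history, candidates)[:limit]
```

Any third party could ship its own `rank()` (chronological, educational-only, whatever) and the platform would only need to supply the data behind a stable interface.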

mentalgear 3 hours ago | parent | next [-]

The solution to all of Big Tech's monopolies is actually pretty simple: Interoperability must become a law - this includes using custom algorithms or allowing other platforms (like your own app) to access YOUR data on whatever platform 'hosts' it.

Cory Doctorow wrote a great article on it:

"Interoperability Can Save the Open Web" https://spectrum.ieee.org/doctorow-interoperability

> While the dominance of Internet platforms like Twitter, Facebook, Instagram, or Amazon is often taken for granted, Doctorow argues that these walled gardens are fenced in by legal structures, not feats of engineering. Doctorow proposes forcing interoperability—any given platform’s ability to interact with another—as a way to break down those walls and to make the Internet freer and more democratic.

Most notably, he retells how early Facebook used to siphon data from its competitor MySpace and act on users' behalf there (e.g. replying to MySpace messages via Facebook) - and then, once Zuckerberg was top dog, it moved to make those same basic interoperability actions illegal, to prevent anyone doing to him what he did to others.

theturtletalks 2 hours ago | parent | next [-]

We can’t depend on these platforms to offer interoperability, or even on laws to force them to do so. The DMA forced Apple to allow third-party app stores in Europe, and they still hampered it so much that hardly anyone uses them.

We need platforms that offer that interoperability and simply connect to these “marketplaces.” Take Shopify, for example: sellers use that platform to list on Amazon, Google Shopping, TikTok Shop, etc. We need open source alternatives to those, where the sellers own the platform and these marketplaces are either forced to be interoperable or left behind by those that are.

For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

It’s a tall task, but achievable and it will happen given enough time.

pegasus 2 hours ago | parent [-]

> For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

There's an acronym for this: POSSE (Publish [on your] Own Site, Syndicate Elsewhere). Part of the IndieWeb movement, for those who want to explore this worthwhile idea further.
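The mechanics of POSSE are simple enough to sketch. Everything below (the domain, the silo publish functions) is a hypothetical placeholder, not any real platform's API:

```python
def publish_to_own_site(post: dict) -> str:
    """Store the canonical copy on your own domain first.
    (Hypothetical; a real site would write to its own backend.)"""
    return f"https://example.com/posts/{post['slug']}"


def syndicate(post: dict, canonical_url: str, silos: dict) -> dict:
    """Push copies to each platform, each linking back to the original.
    silos maps a platform name to a publish function for it."""
    return {name: publish(f"{post['text']}\n\nOriginally posted at {canonical_url}")
            for name, publish in silos.items()}
```

The silo side of this only works if the platforms expose (or are forced to expose) a posting API, which is the interoperability question upthread.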

spl757 38 minutes ago | parent | prev | next [-]

Breaking up these monopolies would be a good start. We aren't supposed to have those. There used to be something we called "regulations" but they got rid of that part I think. Elections have consequences.

FuriouslyAdrift 2 hours ago | parent | prev | next [-]

That just leads to embrace/extend/extinguish

spwa4 2 hours ago | parent [-]

Exactly. The deal of all these platforms is that there is a fuckton of up-front costs. Hard drives. Networks. Peering. Transit. Operators. Payment. Lawyers. SREs. And so on and so forth.

The solution to this used to be that governments provide the platform. You would think this wouldn't be hard to do, since people have now shown that this can work and so it's a guaranteed money maker, or as close as you're going to get.

Yet I can't find a single initiative.

So any such rules will just make all internet platforms disappear ... and be replaced by nothing.

mschuster91 3 hours ago | parent | prev [-]

The foundational problem with interoperability is that it can and will immediately be abused by bad actors as long as there is no price tag attached to every piece of communication.

Among social media, Mastodon (and anything Fediverse) has it the worst, obviously, but Telegram and WhatsApp are rife with spam and scams, and Twitter, back when it still had third-party apps, was rife with credential and token compromises (mostly used to shill cryptocurrencies).

As for the price tag reference - we've seen that with SMS. Sending SMS used to cost real money, something like 20 ct/message, which made SMS campaigns prohibitively expensive. But nowadays? It's effectively free at scale if you go the legit route, and practically free if you manage to compromise someone's account at one of the tons of bulk SMS providers. Apple's iMessage similarly makes bad actors pay a lot, because access to it is tied to a legitimate or stolen Apple product serial.

mindslight 6 minutes ago | parent | next [-]

But bad actors already do this, as there is a monetary incentive to implement adversarial interoperability, coupled with an incentive to not scale it up too much (lest that implementation get cut off sooner). For example, I certainly don't think all of the spam ads I see on Faceboot Marketplace are from individual people manually creating accounts and typing them out.

banannaise 2 hours ago | parent | prev | next [-]

Paywalls can have the opposite of the effect you want. Implemented incautiously, they can fail to disincentivize parties who can make profit in excess of the cost, and it can succeed at disincentivizing genuine, non-profit-motivated interaction.

Imagine how much less you would use text messages if they still had a per-message cost.

malfist 2 hours ago | parent [-]

I would reply to your comment, but my 2GB data allocation for my cell phone is already spent this month.

shimman 2 hours ago | parent | prev | next [-]

Because some hostile entity might rat fuck a slightly better system, we're destined to use the same current shitty system, because something better might have a downside?

Do you understand that this is all literally made up? The rules can change anytime, and society can exert its will to make a better world rather than letting a dozen people decide how technology will shape humanity (mostly in a negative capacity, if you look at the current state of things).

pixl97 2 hours ago | parent [-]

>Because some hostile entity might rat fuck a slightly better system,

And make it a worse system, is what you happened to leave off.

>Do you understand that this is all literally made up

You mean the existing system that evolved from billions and billions of interactions? Explain what is 'made up' about it.

The thing is if you start 'making up' random ass laws that piss people off, they will run screaming back to the billionaires to pwn them with locked down systems. Apple is a great example here. Shit is locked down and people love it.

shimman 2 hours ago | parent [-]

Being afraid to do things because they might possibly, but never proven, be worse is just the political machinations of enforcing the status quo where our corporate overlords get to dictate how technology shapes our lives.

I'm sorry, but that's deeply undemocratic; today's generation should have a direct say in how new things affect their lives.

Failure to do this might literally condemn our species to extinction, and it took less than 200 years to get here. I'm sorry, but they've proven their failure, and it's time to make drastic changes.

Good news is that many people across the electorate agree with this, so now you get to decide which people you want shaping society. The previous world order of US imperialism is going to end, and I'd rather have the people decide what to do than those who want to continue running headfirst into extinction.

pixl97 an hour ago | parent [-]

>The previous world order of US imperialism is going to end

I don't disagree.

Of course Chinese imperialism probably won't be much better.

monarchwadia 2 hours ago | parent | prev [-]

This is a confusing comment. Interoperability and bad actors are separate concerns, because you get bad actors in systems of all kinds, not just in interoperable systems. Paywalling a system does not necessarily mitigate bad actors, either.

tshaddox 3 minutes ago | parent | prev | next [-]

This seems like a clever (but perhaps overly clever) amendment to Section 230 protections for social media.

However, I've always thought that it's pretty bizarre for Section 230 protections to apply when the social media company has extremely sophisticated algorithms that determine how much reach every user-generated piece of content gets. To me there's really no distinction between the "opinion" or "editorial" section of a traditional media publication and the algorithms which determine the reach of a piece of user-generated content on Twitter, YouTube, etc.

burlesona 11 minutes ago | parent | prev | next [-]

I think a better solution would be to repeal section 230 protection for any kind of personalized or algorithmic feed. The algorithm makes you a publisher, and you should be liable for what you publish.

That would make it very hard, nigh impossible, for a platform like YouTube or TikTok to exist as it does today, and would instead favor people self-curating mechanisms like RSS readers etc.

mastax 8 minutes ago | parent | prev | next [-]

That’s like saying the solution to cigarettes is that tobacco shops must be forced to sell clove cigarettes as a not-addictive alternative.

xp84 an hour ago | parent | prev | next [-]

Anything that’s a premium paid feature will be irrelevant. Most people don’t subscribe to YouTube premium, even though they know their kids are watching a ton of ads. Adoption has also been incredibly brisk on the ad tiers of the formerly ad-free TV services like Netflix and Hulu.

I realize “less addictive algo” is a different thing to pay for than removing ads - but it’s, if anything, an even harder sell - I think the layperson wouldn’t even acknowledge that they are vulnerable to being psychologically manipulated. They think they spend so much time on these apps because it’s so enjoyable.

From most parents’ point of view, paying a monthly bill for their children to have a less toxic experience on TikTok or YouTube will be considered an extravagance rather than a responsible safety expense.

ceejayoz 3 hours ago | parent | prev | next [-]

It seems likely that'd result in even worse suggestions becoming the norm as people adopt the third-party that gives the quick dopamine rush. It's like suggesting tastier heroin to fix drug addiction.

bitwank 2 hours ago | parent [-]

Certainly not. People don’t want the slop they push, the anxiety provoking, salacious, clickbaity spam that it has devolved into. Anybody that used YouTube before the last few years can tell you the difference is pretty major. This is not content people want, it’s content that maximizes clicks and ad sales.

ceejayoz 2 hours ago | parent | next [-]

> People don’t want the slop they push…

That's also true for heroin. Plenty of people really want to break the addiction.

The slop exists because people are attracted to it.

bitwank 2 hours ago | parent [-]

Heroin is a different business model than advertising. Respectfully, you are wrong.

ceejayoz 2 hours ago | parent [-]

Gosh, if you say so...

pixl97 2 hours ago | parent [-]

Heh, it's funny watching people, like the one above you, say "This thing is addictive because it is a real object, but this digital object cannot be addictive at all". The argument is so illogical you begin to doubt you're talking to a real person.

SpicyLemonZest 2 hours ago | parent | prev [-]

People don't want to want it. But it's not obvious that merely allowing a choice of recommendation algorithms would allow people to escape the slop. Isn't anyone strong enough to choose a less addictive algorithm necessarily strong enough to not scroll Instagram for hours in the first place?

another-dave an hour ago | parent [-]

I mean, the court case is about these platforms being addictive to kids, so if they said "accounts for users under X years have the algo and time caps delegated to their parents' account by default," it'd go a long way toward negating what they're being accused of.

They've already built all the tools they need for this; it's just that they give them to advertisers rather than end users.

ceejayoz 39 minutes ago | parent [-]

"Let the parents manage it" is, unfortunately, part of the reason we're in this situation in the first place.

butlike an hour ago | parent | prev | next [-]

Or just stop suggesting content. The landing page is just a matrix of already followed accounts with the text "Start by following some accounts you like..." as a placeholder if it's a new account.

theptip 2 hours ago | parent | prev | next [-]

I’m quite bullish on disintermediating the algorithms. AI makes it very easy to plug in your own. We just haven’t figured out the plumbing yet.

I’d be strongly in favor of interoperability laws to pry open the monopolies.

(One dynamic you do need to be careful about especially at first - interoperability also means IG can pull your friend graph from Snapchat, so it can also make it easier for big companies to smother smaller ones that are getting momentum based on their own social graph growth due to their USP. I don’t think this is insurmountable, just something to be careful of when implementing.)

matt_kantor an hour ago | parent | prev | next [-]

> Before Spotify became popular, people would integrate Last.FM into their media players

I still scrobble to Last.fm from Spotify (and other media players). I rarely use it for discovery anymore, but it's occasionally interesting to look at my historical listening trends.

JKCalhoun 2 hours ago | parent | prev | next [-]

If the default algo/behavior is allowed to persist, it's going to be effectively no real change.

Drop the algorithm altogether? I subscribe to channels for a reason.

saadn92 3 hours ago | parent | prev | next [-]

Third-party recommendation algorithms would be interesting, but I think they'd only address one layer of the addictive design the verdict is actually about. Autoplay, infinite scroll, notification timing, the variable reward patterns from likes and comments -- those are all independent of which algorithm picks the next video. You could swap in the most wholesome recommendation engine imaginable and a kid is still gonna sit there for hours if the UI is designed around endless content with no natural stopping points.

theptip 2 hours ago | parent [-]

I dunno, careful what you ban; TV has “infinite scroll” too.

foobiekr an hour ago | parent | prev | next [-]

99.9% of these would just be malicious spyware that people are tricked into agreeing to.

sophacles 44 minutes ago | parent [-]

So better than the 99.99% status quo?

heyitsaamir 3 hours ago | parent | prev | next [-]

Bluesky does this. In fact, the For You algorithm is a community-built algorithm, and it's way more popular than the native Discover algo.
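For reference, a Bluesky feed generator is just a service that answers the `app.bsky.feed.getFeedSkeleton` request with an ordered list of post URIs; Bluesky's AppView then hydrates those into full posts. A rough sketch of the response-building step (the most-liked ranking is an arbitrary example, and a real generator would gather posts from the network firehose):

```python
def get_feed_skeleton(posts, cursor=None, limit=50):
    """posts: [{'uri': 'at://...', 'likes': int}, ...] collected by
    the generator; the ranking policy is entirely up to its author."""
    ranked = sorted(posts, key=lambda p: p["likes"], reverse=True)
    start = int(cursor) if cursor else 0
    page = ranked[start:start + limit]
    # Skeleton shape: ordered post URIs plus a pagination cursor.
    return {
        "cursor": str(start + len(page)),
        "feed": [{"post": p["uri"]} for p in page],
    }
```

The important architectural point is that the ranking lives outside the platform: anyone can host one of these services and users subscribe to whichever they like.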

data-ottawa 3 hours ago | parent | prev | next [-]

How do you prevent a Cambridge Analytica exfiltration situation with third party algorithms?

And how does this prevent addictive algorithms which will win through social selection?

ceejayoz 3 hours ago | parent [-]

The Cambridge Analytica stuff never got fixed, it just got hidden out of sight. The situation is worse than ever now.

Zigurd 2 hours ago | parent | prev | next [-]

That's called a "feed generator" on Bluesky.

basisword 2 hours ago | parent | prev | next [-]

The real solution is going back to a chronological feed of people you actively choose to follow.

mindcrime an hour ago | parent [-]

At the very least, that should certainly be an option that users can select. And when the user selects a feed algo, it should stay fucking set until that same user actively chooses to change it.
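Mechanically, a pure follow feed is trivial compared to a recommender: merge each followed account's posts in reverse-chronological order and stop. A sketch of just that, with posts as simple (timestamp, text) tuples:

```python
import heapq


def chronological_feed(followed, limit=20):
    """followed: {account: [(timestamp, text), ...]} with each
    account's list already newest-first. No ranking, no suggestions,
    no engagement optimization; just a reverse-time merge."""
    merged = heapq.merge(*followed.values(),
                         key=lambda post: post[0], reverse=True)
    return list(merged)[:limit]
```

There is no technical obstacle here, which suggests the reason platforms bury this option is a business one.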

kouru225 2 hours ago | parent | prev | next [-]

Yes please. Algorithms should be plug-in-and-play and not endemic to the app. You should be able to take popular algorithms and plug them into any app

ceejayoz 2 hours ago | parent [-]

That's just laundering the bad actions through a third party, though.

The winning third party algorithm will be the one that gives people the same rush the first party algorithms currently do, because people will use it for the same reasons; they get to see cute AI animals do crazy things forever.

outime 3 hours ago | parent | prev | next [-]

Virtually nobody would choose to pay a subscription for the non-addictive app version, and I'd even say this suggestion is a bit insulting to anyone who isn't high-income.

bitwank 2 hours ago | parent [-]

I will never pay a subscription for the current clickbaity slop. I might if the algorithm were better, closer to YouTube of 10 years ago, when it would suggest lectures, artfully done film shorts, and overall more interesting, high quality content.

dmbche 3 hours ago | parent | prev [-]

Or: algorithms have to be submitted to and approved by a government body before they can be deployed, and are frequently audited.

PokemonNoGo 2 hours ago | parent [-]

I guess this is the only way. I don't think we need a novel approach, and I don't consider this one, since we already have government agencies verifying approved processes in other areas - so why not content distribution?

onlyrealcuzzo 4 hours ago | parent | prev | next [-]

How is any app/website that 1) appeals to kids, 2) sells attention, 3) does A/B testing and/or has a self-learning distribution algorithm NOT guilty of this?

guzfip 4 hours ago | parent | next [-]

It probably helps when you suppress research that shows you’re harming children and allow human traffickers to fester on your platform with 17 warnings or whatever.

which an hour ago | parent [-]

The argument that research was suppressed and this is somehow damning is absurd on its face. The most obvious reason being that they obviously didn't do a very good job of suppressing it given that we hear this claim every day. The second being that they could have just not done this research at all and then there would have been nothing to "suppress" (this terminology is also very odd... if 3M analyzes different sticky notes and concludes that their competitors sticky notes are better than theirs but does not release the results, is that suppression?). The third is that studies with the same results have come out probably every year since 2010 and have been routinely cited in the mainstream press. Lastly, it ignores that many platforms have actually responded to research about potential harms of social media by implementing safeguards on teen accounts.

Look at the plaintiff in this case: it's a mentally unstable person who blames her life problems on social media. Never mind the fact that she had been diagnosed with mental illnesses as an early teen, or that an overwhelming majority of people who use social media don't develop eating disorders or other mental illnesses as a result of it (and in fact the incidence of say bulimia peaked 30 years ago in spite of almost universal social media adoption among young people). This is not at all like smoking where 15% of smokers will get lung cancer.

And due to some absurd legal reasoning the plaintiff was allowed to pseudonymously extort $3 million out of tech companies. Worst of all I see people on a technology forum applauding this out of some sort of resentment towards large companies!

text0404 21 minutes ago | parent | next [-]

Nobody ever accused these companies of being competent at suppressing the research (which includes third parties btw, not just internal).

Companies do this research for all sorts of reasons (including legal compliance, demonstrating due diligence to regulators, to understand users and improve products, etc etc etc). For example, it's not like Zuck commissioned an internal study to show how they're harming children, more like some internal team was seeking to understand why kids love a certain feature which led them to conclusions that make the company look bad.

To your third point, that research is usually leaked by whistleblowers or conducted by third parties, not because of the altruism of these companies.

Finally, the platforms aren't doing enough and with this court case, it seems like they've persisted in finding ways to hook children because of financial incentives.

The sources cited in this article are a good primer for understanding what these companies are doing: https://www.transparencycoalition.ai/news/meta-suppressed-re...

danny_codes 8 minutes ago | parent | prev | next [-]

The jury disagrees with you.

jf22 34 minutes ago | parent | prev [-]

The "overwhelming majority" standard for harm seems odd when you use 15% of smokers getting harmed as an example. 15% is not an overwhelming majority.

KaiserPro 3 hours ago | parent | prev | next [-]

I think there is a fourth portion that is probably more important:

Actively ignoring harm caused by your product. TV/radio also sold attention, but there were pretty strict rules on what you can and can't broadcast, and to whom (ignoring cable for the moment). It's the same for services: things that knowingly encourage damaging behaviours are liable for prosecution.

Longlius 30 minutes ago | parent [-]

Except cable is the more apt comparison here - broadcast rules exist because airwaves are an extremely finite resource and so we can argue that the government has a vested interest in what kind of speech can happen on them. No such scarcity exists with web services.

SquibblesRedux 30 minutes ago | parent | prev | next [-]

I would argue that no app/website should be selling itself to kids. No corporation should be trying to tether its ARR to children's attention.

xahrepap 6 minutes ago | parent [-]

When my kids were young, we canceled our Disney Channel / etc cable subscription and showed them more PBS and similar.

It was really annoying turning on a show for 30 minutes then for the next week hearing about that new toy they just have to get. It was exhausting.

sampullman 4 hours ago | parent | prev | next [-]

I think there's a little more nuance than that, but it seems roughly correct.

Wouldn't it be better if apps/websites targeting kids didn't use A/B testing to be more addictive?

KaiserPro 3 hours ago | parent | next [-]

I think addiction is a red herring.

Pokemon is addictive; computer games are addictive. It's whether they are knowingly causing harm, and/or avoiding attempts to stop that harm.

Zigurd 2 hours ago | parent [-]

Addictive patterns in games and other online activity are a bit less innocent than you're portraying: "knowingly causing harm" is too low a standard. A lot of the profitability of online games, prediction markets, etc. comes from the whales, and the whales are probably addicted. If your business is a whale hunt, you are possibly causing harm, at least to the extent that addiction is dangerous.

schmidtleonard 3 hours ago | parent | prev | next [-]

> more nuance

Not enough to diffuse liability. 15 years ago when recommender algorithms were the new hotness, I saw every single group of students introduced to the idea immediately grasp the implication that the endgame would involve pandering to base instincts. If someone didn't understand this, it's because

> It is difficult to get a man to understand something, when his salary depends on his not understanding it. - Upton Sinclair

ramon156 3 hours ago | parent | prev | next [-]

They'd find another method. Why are we allowing this in the first place?

I don't have an answer to fix this whole mess, but it starts with our attitude towards addiction. We've built a system that rewards addiction in all sorts of places. Granted, every addiction is different, and I'm of the opinion that it's not (drug = bad), it's how you use it and react to it. We can control the latter, but we choose to ignore it because we're too busy with anything else. This is a tale as old as time...

greenhearth an hour ago | parent | next [-]

"Free market" and "entrepreneur spirit" fetishism and fear of collective social action against individual drives.

aaomidi 3 hours ago | parent | prev | next [-]

Relative to how long it takes for law to catch up to what's going on, YouTube and Facebook have been around for only a tiny amount of time.

bluefirebrand 3 hours ago | parent [-]

They have been around long enough to have done unknowable damage to entire generations of humans

aaomidi 3 hours ago | parent [-]

As usual unfortunately laws are reactive.

ToucanLoucan 2 hours ago | parent | prev [-]

> Why are we allowing this in the first place?

Exactly what I keep coming back to.

For me, it feels like you could cut this problem down substantially by eliminating section 230 protection on any algorithmically elevated content. Everywhere. Full stop.

If you write or have an algorithm created that pushes content to users, in ANY fashion, that is endorsement. You want that content to be seen, for whatever odd reason, and if it's harmful to your users, you should be held responsible for it. It's one thing if some random asshole messages me on Telegram trying to scam me; there's little Telegram can do (though a fucking "do not permit messages from people not in my contacts" setting would be nice) but there is nothing at all that "makes" Facebook shovel AI bullshit at people, apart from it juices engagement, either by genuine engagement or ironic/ragebaiting.

And AI bullshit is just annoying, I've seen "Facebook help" groups that are clearly just trawling to get people's account info, I've seen scam pages and products, all kinds of shit, and either it pisses people off so Facebook passes it around, or they give Facebook money and Facebook shoves it into the feeds of everyone they can.

It's fucking disgusting and there's no reason to permit it.

pjc50 an hour ago | parent | next [-]

> If you write or have an algorithm created that pushes content to users, in ANY fashion, that is endorsement

Yes. People make free speech arguments about this, but the list and order of stuff returned by algorithmic non-directed (+) lists is clearly a form of endorsement. Even more so is advertising, which undergoes a bidding process. Pages which show ads should be liable if those ads are fraudulent, especially if they're so obviously fraudulent that casual readers suspect them immediately.

(+) Returning a list of stuff in a user-specified query, on the other hand, is not endorsement. Chronological or alphabetical order or distance-based or even random is fine.

Note that section 230 is, of course, US specific and other countries manage without it.

SpicyLemonZest 2 hours ago | parent | prev [-]

Eliminating section 230 protections would heavily disfavor any kind of intellectually stimulating content, because it's hard for a platform to scalably verify that nobody's making defamatory claims. But pointless clickbait, heavily filtered Instagram models, etc. don't really have liability concerns on a video-by-video level. To me it seems like this makes the problem worse.

roughly an hour ago | parent [-]

It’s not eliminating section 230 entirely, it’s eliminating it for algorithmically promoted content. If you have a site that has user content and you present that content in a neutral fashion, section 230 applies. If you pick and choose what content to present to users (manually or by algorithm), you’re no longer a neutral platform, and shouldn’t be getting the benefit of 230.

SpicyLemonZest 17 minutes ago | parent [-]

I understand that. My point is that this would mean algorithmic feeds can only contain vapid, pointless content with no liability concerns. To me, it doesn't improve the world to require that Instagram and Youtube exclusively serve slop, even if that might cause some number of people to abandon them for non-algorithmic platforms with better content.

steve-atx-7600 3 hours ago | parent | prev [-]

For context, facebook is so dystopian when I login once every few years that I’m not sure I’ll ever use it again. And, I hate wading through the YouTube cesspool to find some educational content I like. But, I don’t think it makes sense to ban a/b testing or optimization in general. Some company could use it, for example, to figure out how to teach math to kids in a way that’s as engaging as possible. This would be “more addictive” technically.

sampullman 3 hours ago | parent [-]

That's a good point, I'm not 100% sure it's worth throwing away the potentially beneficial uses. There might not be a solution that's both feasible to implement and avoids banning useful things. In the end I usually come back to it being the parent's responsibility to monitor usage, limit screen time, etc., but it hasn't been working so well in practice.

parpfish 3 hours ago | parent | prev | next [-]

A/B testing is one way to make things “addictive” but you can also make addictive products without it.

A really good designer could make a highly engaging app, or an editor could write clickbait headlines, all without testing.

esafak 3 hours ago | parent [-]

These products maximize revenue through engagement with advertisements. The outcome is built into their business model.

systemsweird 34 minutes ago | parent | prev | next [-]

Probably not much, other than scale. Facebook is large enough that they can hire behavioral researchers to make this stuff more addictive while looking the other way and raking in the money. I think Roblox is just as bad (maybe worse) regarding addiction for kids. I've played hundreds of hours with my sister's kids, and the way all these low-quality slop games handle grinding, progression, and pay gating is honestly disgusting.

But then again, I manage to get myself addicted to a video game usually once a winter for a few weeks, and don’t play games for the rest of the year. There’s really no solution to this, but I don’t want to live in a world where everyone is hopelessly addicted to shallow digital experiences.

steve-atx-7600 4 hours ago | parent | prev | next [-]

How’s this different than tv that a kid might see that has ads and programming targeting kids?

I watched 80s horror movies when I was in elementary school and had nightmares for years. Should I sue now?

How about parents be held responsible for how they care for their kids or not? Maybe a culture that judged parents more strongly for how they let their kids spend their time would be an improvement.

everdrive 4 hours ago | parent | next [-]

Being able to find some basis for comparison between two things does not render them equivalent, and this is an extremely frequent fallacy I see with regard to technology discussion on HN.

parpfish 3 hours ago | parent | next [-]

When it comes down to it, I’m not sure how you differentiate an “addictive” product from a well-made product that I choose to keep using.

When people say that Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game (and maybe a little lament about staying up too late).

But the addictive nature of social media feels different and I can’t figure out what that distinction is.

someguynamedq an hour ago | parent | next [-]

Tetris and Civilization are also harmfully addictive, but the scope of the behavior they can hijack is smaller. "One more turn" at 2am is harmful, just not as harmful as something that knows about and interacts with every aspect of your social life and your view of the real world, like social media apps do today.

A really well built hammer doesn't make you want to spend all your time using a hammer, it's just good when you need a hammer. That's a well-made product that you choose to keep using.

card_zero 3 hours ago | parent | prev | next [-]

People will now say "the algorithm" and "dopamine", explaining nothing. You see, social media is truly addictive because it's been honed to be addictive in some way that isn't specified or known or actually true.

OK, let me try to analyze it:

1. Humans are idiots.

2. We have idiot glitches where we obsess over some particular thing. This is our own business and our own fault, and is impossible to tease apart from just liking stuff a lot and benefitting from it.

3. These glitches tend to accumulate in certain areas, and then some companies find themselves in the position of profiting from human glitchy idiocy, even though they didn't want to be behaving like scammers.

4. Then some of them get cynical about it and focus on that market segment, the obsessed idiots. This can include gambling and social media.

genthree 2 hours ago | parent | prev | next [-]

I have an instagram account because it's by far the best way I know of to keep up with various small businesses, local or otherwise, that I like.

What I go into the app to do: see if there are any updates from those businesses.

What the app presents me on launch: a bunch of nonsense selected for what will best-distract me. And you know what? Sometimes it does catch my attention for a minute or two!

What the app doesn't let me do: disable the nonsense, or even default to the tab of accounts I'm following. Hell, they even intentionally broke ways to achieve this with iOS scripting; you'd think that'd be niche enough that they wouldn't care, but apparently enough people were doing it that they bothered to break it.

The algo feed is addictive on-purpose. I would turn it off if I could, and there's a damn good reason they don't let you do that. I "choose" to engage with it sometimes, which sometimes gets people coming out to go "oh-ho! So your revealed preference is that you like the feed!" but that's plainly silly, as that's highly contextual and my in-fact actual preference would be to never see that feed again in my life, and in fact I've spent a little time trying to make that happen. It's only my "revealed preference" in a world where I've had to compromise by occasionally losing a couple minutes to this crap because the app won't let me go straight to what I actually want. That's my true preference, the "revealed" one is only ever briefly flirted-with in a context in which I'm prevented from attaining my actual preference.

Consider a person who struggles with eating junk food. They don't keep junk food at home, in fact. That is their preference, to not keep it around, because they don't want to eat it and know they will if it's there. Now concoct some scenario in which, in exchange for something else they want, they have to take delivery of a couple bags of potato chips and a box of cookies every week. And sometimes, they eat some of that before tossing it out or giving it away! "Ah-ha, so their revealed preference is that they want junk food!" Like, no, of course not.

There's a reason these apps have to prevent you from using any part of them except with the presentation they like: because they're being addictive on purpose, and tons of users do not want the addictive parts at all, but do want other parts.

prewett 3 hours ago | parent | prev | next [-]

Not to disagree with you, but in the case of Civilization, I do find it addicting in both senses. It is one of two games that I just cannot play, because I will be up until 3am playing. (Puzzles and Dragons was the other one, I think I had to uninstall it the day after I downloaded it)

pixl97 2 hours ago | parent [-]

Oh, not Factorio. I guess Factorio might be slightly less addictive than crack because I was eventually able to put it down.

everdrive 3 hours ago | parent | prev | next [-]

I think this represents a strong misunderstanding of what addiction is, and how it works. I mean this respectfully, and not combatively -- I expect you have never had problems with addiction.

When it comes to behavioral psychology research, there is a strong understanding of concepts such as behavioral reward schedules: fixed-ratio rewards, fixed-interval rewards, variable-ratio rewards. People have a very clear understanding of what sort of stimulus is and is not prone to addiction. You can get a mouse in a cage hopelessly addicted to pressing a lever for a reward depending on what reward schedule you use, and this does not happen to a mouse who can just get the reward at a regular interval (or perhaps merely at a less-addicting interval). The mouse in the cage pressing a button set to a variable-ratio reward is equivalent, in a very literal and direct way, to an old person using a slot machine. This also translates to social media with infinite scrolling: most of the stories are duds, but the variable reward is the extremely enticing (or enraging) story that just might be the next one.
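
A toy simulation (my own sketch, not from the research literature; all parameters are arbitrary) makes the difference concrete: on a fixed-interval schedule, pressing harder buys nothing extra, while on a variable-ratio schedule every extra press is another lottery ticket, so there is never a natural stopping point.

```python
import random

def session_variable_ratio(press_rate, duration=60, p=0.05, seed=42):
    """Variable-ratio: every press is an independent chance at a reward."""
    rng = random.Random(seed)
    presses = press_rate * duration
    return sum(rng.random() < p for _ in range(presses))

def session_fixed_interval(press_rate, duration=60, interval=10):
    """Fixed-interval: a reward becomes available every `interval` seconds;
    the first press after that collects it, all other presses earn nothing."""
    return duration // interval if press_rate > 0 else 0

# Pressing ten times as fast earns roughly ten times the rewards on the
# variable schedule, and exactly nothing extra on the fixed one.
slow_vr = session_variable_ratio(press_rate=1)
fast_vr = session_variable_ratio(press_rate=10)
slow_fi = session_fixed_interval(press_rate=1)
fast_fi = session_fixed_interval(press_rate=10)
print(slow_vr, fast_vr, slow_fi, fast_fi)
```

The numbers don't matter; the shape of the incentive does, and it's the same shape as a slot machine or an endless feed.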

close04 3 hours ago | parent | prev [-]

> Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game

Because it's a figure of speech, not a clinical diagnosis. Literal and figurative addictions are different beasts.

Intent, premeditation, and scale are major differentiators. When they know they will cause harm, concentrate and fine-tune it for effect, turn it into a firehose, and target it at specific individuals, it's very, very different from what random ads, games, or movies do. These companies literally designed their products with the intent to make them addictive and to target children, knowing the full implications and ignoring the harm they caused.

You're comparing a drug dealer who only sells to kids to a store clerk who also sells ice cream to kids. It doesn't take more than scratching the surface to realize the similarity is very fleeting.

steve-atx-7600 3 hours ago | parent | prev | next [-]

I understand what you’re saying, I personally don’t like or use social media, but I don’t agree that these companies are at fault after reading this article and others. I’d rather be wrong and learn something than think I’m right, so I welcome further criticism.

everdrive 3 hours ago | parent | next [-]

I agree with you that parents need to ultimately be responsible for keeping their kids off social media. I think there are a few problems here:

- Social media is still somewhat new, and the broader public is only now discovering that it's a clear net negative both personally and for society. Because this is such a new realization, I think a LOT of people have not really figured out how this problem should be dealt with (both personally, via social norms, but also with regard to laws and regulations).

- No matter how awesome of a parent you are, 100% of your kid's friends will have social media and they will introduce it to your kid. That may do less harm than if they have it themselves, but some harm will still be done.

- There are network effects to consider. It's true that it's your personal fault if you use cocaine -- however we also understand that cocaine is so addictive that it really cannot be used safely. Social media is metaphorically the same. It's a personal failing if you're a social media addict, however broadly almost everyone is susceptible to it. In my mind, that is an argument for regulation.

Now that said, I have zero faith that our government can actually build sensible regulation here.

F7F7F7 3 hours ago | parent | prev [-]

They strategically use patterns that directly trigger the release of dopamine into the brain.

They've created algorithms that use slot machine like experiences that keep kids hooked to the screen.

These algorithms feed users barely moderated content that feeds their worst instincts, with almost surgical precision when they want to elicit engagement.

Then when research shows them the harm they're causing, they bury it, hire lobbyists, and double down.

Switch out a few words up there and you have the big tobacco playbook.

nxor2 an hour ago | parent [-]

It's not just kids. My parents have spiraled in this way too. Why interact with each other when reels are more exciting? Why pursue friendships if you can experience it parasocially? This has been incredibly depressing, and it's a reason I make sure to value the people in my life. I have a lot of disgust for Meta and Google seeing what they've done to society broadly. All for money

card_zero 3 hours ago | parent | prev [-]

Right, like social media and addictive drugs for instance.

roxolotl 3 hours ago | parent | prev | next [-]

Both things can be true. Parents can share responsibility. But it is also the case that Facebook actively suppressed research that showed that children using their platforms experience emotional harms. It is also the case that around the time you were in elementary school discussions about children’s programming had been ongoing for years and eventually regulations were put in place[0].

0: https://en.wikipedia.org/wiki/Regulations_on_children's_tele...

steve-atx-7600 3 hours ago | parent [-]

I can agree that they acted to harm society knowingly. I used to think regulation could help, and maybe it can, but if there were some way to shape the culture to value, for example, educational TV programming, I think that would be the most powerful influence on tech/media companies. Regulation could serve to inform parents that "this programming/platform is known to rot your kid's mind," like a nutrition label, and someday hopefully parents will be more likely to disallow it, as some do knowing how much sugar is in sodas.

ceejayoz 3 hours ago | parent | prev | next [-]

> How’s this different than tv that a kid might see that has ads and programming targeting kids?

Those ads didn't adjust themselves on a per-child basis to their exact interests.

kspacewalk2 2 hours ago | parent | prev | next [-]

Parents ought to be held responsible for how they care for their kids. This isn't just true of their use of social media and devices, but also when it comes to teaching them to look both ways when crossing the street; making sure they understand the concepts of private parts, consent, and personal space; making them understand the dangers of alcohol; and many other things.

Does any of that obviate the need for safe urban design, anti-CSAM and anti-molestation laws, or laws prohibiting the local dive from serving a cold one to my 11 year old? Will simple appeals for "parental responsibility" suffice as an argument for undoing those child safety systems we put in place, or will they be met with derisive dismissal? Why should your "solution" be treated any differently? In fact you offer none. Yours is the non-solution solution, the not-my-problem solution, the go-away solution. Not good enough on its own, sorry.

ipython 25 minutes ago | parent | next [-]

As sibling comments point out, parents are already held overly responsible for how they care for their kids, to an absurd degree.

I have had CPS called on me by an overbearing school administrator. Have you had that happen to you? Let me tell you, it's not a fun experience.

Enough of this "blame the parents" mentality! Ironic given that the goal for all these platforms is growth at all costs. Where do you think "growth" comes from, after all? If you make being a parent so goddamn difficult that it's more rational to just not do it, guess what, poof goes your sweet, sweet growth.

So tired of this line of thinking. The parents are put into an impossible situation. Stuck between kids who by definition and by design will test the boundaries that they're given, and tech platforms that are propped up with not just trillions of dollars of valuation, but the societal expectation that you engage with them. Want your kids to compete in sports? Well, they need to have WhatsApp and Instagram to keep track of team events!

Give me a break. Equating controlling social media and devices to "look both ways when crossing the street" is disingenuous at best. There are no companies that make billions of dollars in advertising revenue telling your kids to jaywalk. But Facebook gladly weaponizes their algorithm to drive "engagement" - and, surprise, children with still-forming prefrontal cortices are drawn to content that reinforces their natural self-criticisms and doubts. So now my child, who has to be on Instagram to keep track of sports schedules, is also force-fed toxic content because that's what a mechanical algorithm thinks is most "engaging" based on my derived psychological and demographic profile.

You want to talk about CSAM? X proudly proclaims that they have every right to produce deep-fake pornography with the faces of underage children. What action shall I, as an individual parent, take if my 15 year old girl's face is suddenly pasted onto sexually explicit video and widely shared thanks to xAI's actions? Shall I be held responsible for how I "let this happen" to my child?

kspacewalk2 21 minutes ago | parent [-]

You seem to imply in your reply that I disagree with you, hence necessitating a polemic style. I would have thought the last few sentences of my comment make it clear where I stand on simplistic appeals to "parental responsibility".

criddell an hour ago | parent | prev | next [-]

> Parents ought to be held held responsible for how they care for their kids.

If YouTube detects that a child is watching 5 hours of video a day, should Google alert child protective services?

kspacewalk2 an hour ago | parent [-]

Why don't we start with a mechanism for user registration that does not involve a simple pinky-swear "over 13?" checkbox and then continue the conversation about further steps.

zer00eyz an hour ago | parent | prev [-]

For 30 years (the '60s to the '90s) we told parents "It's 10pm, do you know where your kids are?" with an ad, on TV. We came home to empty houses and let ourselves in with a key around our necks.

Now we call the police and arrest parents if kids are outside unsupervised. https://www.cnn.com/2024/12/22/us/mother-arrested-missing-so...

When I was a child in the 80s and 90s, we had "jobs" as kids: mowing lawns, paper routes, and so on. Now if you offer to mow your neighbor's lawn, the cops get called: https://www.fox8live.com/2023/07/26/officer-surprises-young-...

Parents are afraid to let their kids out of their sight, and they tend to look down on those of us who have been pragmatic because we understand the data (and not the fear).

Talk to anyone who is Gen X and they will tell you that we basically got thrown outside all day (and had fun). Parents can't say "go outside and play," so kids end up getting handed devices... and they are going to play and explore and do the dumb things that get them in trouble.

> those child safety systems we put in place

Except we have denormalized things that SHOULD be perfectly fine. And as fewer kids get to go outside unattended with friends, it pushes their peers to go "online" to socialize.

Maybe the government needs to run commercials: "It's 10am, why isn't your child outside playing with the neighbor kids unsupervised?"

mrweasel 2 hours ago | parent | prev | next [-]

> How’s this different than tv that a kid might see that has ads and programming targeting kids?

It's not; that's illegal as well. You cannot target kids with TV advertising.

jeffbee 4 hours ago | parent | prev [-]

The difference is largely in the way that the legal caste perceives themselves to be aligned with media but opposed to tech.

everdrive 4 hours ago | parent | prev | next [-]

Correct, selling attention inevitably leads to harm.

wffurr 3 hours ago | parent [-]

As a parent, the only solution is sticking to ad-free subscription services. PBS is a godsend here, but there are other good options out there too. Tragic that public broadcasting funding was cut when there are clear harms in the free* commercial options.

*Except for your time and mental health of course

everdrive 3 hours ago | parent | next [-]

Agreed. Libraries have books and DVDs, and you have things like the classical stations. You also have playgrounds and walks in the park, etc. (I'm also a parent of two young children.)

Always doing wholesome stuff with your kids is certainly not easy or trivial, but there is a cascading effect here. If your child does not expect to be able to just watch TV all the time it's easier to keep them interested in other things. Once that expectation is burned in you'll be fighting it for a while. And once that expectation is burned in, a small child will _never_ say "I've had enough youtube, I don't need any more."

So I really don't want to be self-righteous about always doing wholesome stuff with your kids (we definitely do not succeed 100% of the time) -- but rather point out that letting them use addictive media has negative, cascading consequences that actually do make it harder for you as a parent. It's analogous to drinking to relax. You get relief now, and pay for it later. Not actually a good tradeoff much of the time.

nxor2 an hour ago | parent [-]

The unfortunate reality is, the internet has more up-to-date info than books, DVDs, PBS, and even 'the classical channel.' I play the piano and have found immense amounts of rare but nice music online, and only online. I completely agree that the media is bad; I'm just pointing out that it's necessary to a degree if learning is your aim.

criddell an hour ago | parent | prev [-]

PBS is great if you are looking for a workable harm reduction strategy. Eliminating that type of entertainment is probably an even better goal.

embedding-shape 4 hours ago | parent | prev | next [-]

I guess ultimately it depends on if the app/website authors do so "negligently" or not.

> Jurors were charged with determining whether the companies acted negligently in designing their products and failed to warn her of the dangers.

So if you do so while providing warnings and controls for people, that might make it OK in the eyes of the law?

SecretDreams 3 hours ago | parent | prev | next [-]

Because most are just nowhere near as good and effective at ruining a kid's mind as Meta. If others were as good as Meta at destroying whole generations of cognitive development, they'd probably also be liable.

SirFatty 4 hours ago | parent | prev | next [-]

algorithm would be the key word I think.

guywithahat an hour ago | parent | prev | next [-]

It sounds like an adult was awarded $6 million because she watched a lot of youtube/instagram as a kid. Literally any social media site would be guilty of this; I hate to say it but we need better corporate protections if cases like this are allowed to enter court.

At least legal experts are critical of the decision: '“I don’t think it should have ever gotten to a jury trial,” said Erwin Chemerinsky, dean of the UC Berkeley School of Law'

DavidMcLaughlin 3 hours ago | parent | prev [-]

A/B testing is very, very different to handing over control of your content to a reward function that optimizes for time spent over any other criteria.

We had 10+ years of products like Facebook, Twitter, YouTube, hell even LinkedIn, with a basic content model of "you build your own graph of people who you pull content from"; their job was to show it to you and put ads in there to fund the whole enterprise. If I decided to follow harmful content? That was a pact between me and the content creator, and YouTube was nothing more than a pipe the content flowed through. They were able to build multi-billion-dollar businesses off of this. That's really important: this was enormously profitable. But then the problem arose that people's graphs weren't interesting enough, and sometimes they'd go on the thing and there were no new posts from people they followed, and this was leaving money on the table. So they took care of that problem by handing over control of the feed to the reward function.

More accurately, especially for Meta products: they completely took control away from you. You didn't even have the option to retain the old, chronological social graph feed anymore. And it was ludicrously profitable. So now the laws of capitalism dictate that everyone else has to follow suit. I now have extensions on my browser for Instagram and YouTube to disable content from anything I don't follow - because I still find these apps useful for that one original purpose they had when they blew up and became mainstream. Why are these browser extensions? Why can't I choose to not see this stuff in their apps? That's the major regulation hole that led to this lawsuit, imo.
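
To make the shift concrete, here is a minimal sketch of the two feed models described above (my own illustration; the field names and numbers are invented, not any platform's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch
    followed: bool               # is the author in my graph?
    predicted_watch_time: float  # the reward function's engagement estimate

posts = [
    Post("friend_a", 100, True, 5.0),
    Post("friend_b", 200, True, 8.0),
    Post("rage_baiter", 50, False, 90.0),
    Post("brand_x", 150, False, 40.0),
]

def chronological_feed(posts):
    """The old model: only accounts you chose to follow, newest first."""
    return sorted((p for p in posts if p.followed),
                  key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """The reward-function model: anything goes, ranked by predicted time spent."""
    return sorted(posts, key=lambda p: p.predicted_watch_time, reverse=True)

print([p.author for p in chronological_feed(posts)])  # ['friend_b', 'friend_a']
print([p.author for p in engagement_feed(posts)])     # 'rage_baiter' ranks first
```

Same posts, same users; the only thing that changed is the sort key, and with it who controls what you see.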

It's the same thing you see with people blaming smartphones for brainrot. We've had 15 to 20 years of smartphones with more or less the same capabilities as they have today and for the vast majority of that time my phone didn't make books less interesting or make me struggle to do chores or manage my time. For a full decade or more I saw my phone as a net positive in my life, was proud to work for Twitter and generally saw technology like the Louis CK bit about the miracle of using a smartphone connected to WiFI on an airplane. But in the last five years or so, things have noticeably and increasingly gone to shit. Brainrot is a thing. All my real life friends who are the opposite of terminally online or technical are talking about it. I don't use TikTok but it seems like that is absolutely annihilating attention spans. The topic of conversation over drinks is how we've collectively self-diagnosed with ADHD and struggle with all kinds of executive function.. but also are old enough to remember a time when none of this existed. Complete normies are reading Dopamine Nation and listening to Andrew Huberman trying to free themselves.

I don't know what the exact solution is, but there's at least a simpler time we can point to when we all had smartphones and we were all connected via platforms and we all posted and consumed stupid pictures of each other and it wasn't.... _this_.

onlyrealcuzzo 3 hours ago | parent | next [-]

Great point RE the self-learning algorithms. That's what I intended originally, but didn't communicate clearly.

thin_carapace 3 hours ago | parent | prev [-]

regarding brain rot, short form content is absolutely going to be the root physical cause - people could tolerate smartphones prior to the inception of short form content. on a cultural level, this level of destruction could be compared to the effects of a coordinated and targeted attack from enemy nation states - if not for the fact that we did this to ourselves in the name of profit. one can only hope that the old guard wakes up to systematically handle this issue that we have no familiarity with, otherwise our system will buckle under the pressure of 10-20 years worth of nonfunctional humans. i do find a technocratic dystopia far more likely, considering the aforementioned mentally castrated opposition ... hows a generation of kids going to win against trillions of dollars of zuckerberg 'engineering' steering them since birth? shame on the 'engineers' who engendered this mess, shame on their shepherd 'managers', and shame on the sociopaths at the top.

jimmyjazz14 44 minutes ago | parent | prev | next [-]

Coming from someone who hate social media (and has kids) this might seems like a good thing on the surface, but I worry it will be another case used to allow the government to limit speech on the internet for adults.

ferguess_k 9 minutes ago | parent | prev | next [-]

What a surprise! I guess they didn't pay enough protection money. Still, better than nothing.

freshtake 2 hours ago | parent | prev | next [-]

Short form video is a different beast altogether, and much more concerning. The fact that these platforms don't offer a way to avoid short form altogether is a big issue.

YouTube allows you to "show fewer shorts" but what if you don't want them popping up at all?

AI Slop is the best thing to happen to these platforms - because it will lower trust and engagement as people (hopefully) become tired of inauthenticity. Rage bait is potent when the event in the video _actually_ happened, but when you realize it was AI generated, the manipulation feels even more obvious (though it was always there).

These platforms should also allow users to understand how the algorithm has categorized them, and be able to configure it. YouTube, Instagram, et al. would be safer places for viewers if they allowed users to tell them what they want to be exposed to, and what they don't. Big tech is dodgy about this currently, because the more control the user has the lower the engagement (good for the user, bad for profit).

malfist an hour ago | parent [-]

That "show fewer shorts" button doesn't do a damn thing. I click it, refresh the page and whala, shorts.

cphoover an hour ago | parent [-]

Previously I made a Chrome extension that removes them from the web, but I haven't updated it in a while. Basically it just inspects the HTML/CSS patterns of the shorts components and removes them from the page. You could probably code/vibe-code a similar extension in 10 minutes.

absoluteunit1 3 hours ago | parent | prev | next [-]

Oh man if they think YouTube and Instagram are addicting they should see what Roblox does lol

adrr 9 minutes ago | parent | next [-]

There's also Prodigy, which schools push on kids to practice math; it has the same thing, including pay-to-win mechanics.

embedding-shape an hour ago | parent | prev [-]

As someone who has maybe heard about Roblox once, like three years ago: what does Roblox do that is way more addicting than YouTube and Instagram? And are they also ignoring reports showing the harm, even more than YouTube and Instagram, if I understand you correctly?

paulkon 3 hours ago | parent | prev | next [-]

Just needs a health warning label, like on alcohol or cigarettes. Then on to the high-sugar products, and a quarter of the grocery store.

ddoolin 3 hours ago | parent | next [-]

If we want to compare it to alcohol/cigarettes, then kids shouldn't be allowed to use this either.

pearlsontheroad 3 hours ago | parent [-]

and the government should tax it accordingly

ibejoeb an hour ago | parent [-]

I don't think that you can practically expect to tax speech.

jf22 34 minutes ago | parent [-]

You can tax reach though.

alexlesuper 3 hours ago | parent | prev [-]

We have health warnings for food that contains lots of sugar, fat and/or sodium in Canada

robinanil 3 hours ago | parent | prev | next [-]

I have a somewhat unusual vantage point on this.

I'm a former Google engineer, now running a children's mental health startup (Emora Health), and my toddler is already on YouTube Kids.

So this verdict hits on every axis for me. I wrote up my full take here [1], but the short version: I don't think the "Big Tobacco moment" framing that the NYT is pushing actually holds up.

Litigation is negative reinforcement, and if you've ever tried telling a toddler "no," you know how well that works long-term. The families in this case absolutely deserve to be heard. The harm is real. But courts can only punish; they can't redesign a recommendation algorithm.

The change has to come from people who understand these systems building better ones.

Haidt has been saying for years what this verdict just confirmed. The evidence was never the bottleneck. The will to design differently was.

I will give you a simple experiment. Try blocking Blippi from YouTube Kids. Man, it's crazy: even if you block the main Blippi and Moonbug channels, hundreds of channels have Blippi content cross-posted, and it keeps popping up. I know it's easy to build a Blippi block feature using AI that blocks across channels.
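
As a sketch of what blocking across channels might look like (purely hypothetical; naive keyword matching stands in for whatever classifier you would actually use), the key is to match on the content itself rather than the channel, so cross-posted copies get caught too:

```python
BLOCKED_TOPICS = ["blippi"]  # hypothetical per-family block list

def is_blocked(video, blocked=BLOCKED_TOPICS):
    """Match on title/description/tags, not the channel name, so a
    re-upload on a random channel is still filtered out."""
    haystack = " ".join([video["title"],
                         video.get("description", ""),
                         *video.get("tags", [])]).lower()
    return any(topic in haystack for topic in blocked)

videos = [
    {"title": "Blippi Explores a Fire Truck", "channel": "Blippi"},
    {"title": "BEST Blippi compilation!!", "channel": "RandomReuploads42"},
    {"title": "Counting with Trains", "channel": "SomeOtherShow"},
]

# Only the unrelated show survives, even though one Blippi copy
# came from a channel that was never explicitly blocked.
print([v["channel"] for v in videos if not is_blocked(v)])
```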

That's the kind of solution we need. I know we have the tools. We just need intent and purpose.

[1] https://www.emorahealth.com/clinical-insights/social-media-v...

acmecorps 10 minutes ago | parent [-]

Just a tangent, interesting that you brought up Blippi. Any issues that you have with Blippi if you don't mind me asking? :D

nottorp 3 hours ago | parent | prev | next [-]

Just kids? Not adults?

kevincloudsec an hour ago | parent | prev | next [-]

two verdicts in two days, $375m in new mexico and $6m in LA. meta's insurance company already got cleared of covering these claims. if even ten more states follow, meta is paying out of pocket at a scale that actually shows up on the balance sheet.

GardenLetter27 3 hours ago | parent | prev | next [-]

Mandatory age verification is coming.

_kidlike 3 hours ago | parent | next [-]

my thoughts exactly... this "verdict" came with very suspicious timing.

highstep 3 hours ago | parent | prev | next [-]

otherwise know as mandatory identification

2OEH8eoCRo0 2 hours ago | parent | prev [-]

Good. Long overdue

ramesh31 4 hours ago | parent | prev | next [-]

I've heard about "landmark" cases against these companies over and over again for the last decade. There seems to be at least one every couple of years. And yet literally nothing has ever happened or changed.

petcat 3 hours ago | parent | next [-]

Since these are civil lawsuits, it just takes more people coming forward to sue. There are plenty of cases where a jury found a defendant liable for damages only for the defendant to continue the bad behavior and subsequent juries awarding ever-increasing and compounding punitive damages. Big Tobacco and Purdue Pharma (went bankrupt) are examples of this pattern. Monsanto was famously hit hard with massive "repeater" damages after they continued selling and marketing Roundup despite prior judgements.

The exact same can happen to Big Tech. The goal is to get them to stop the bad behavior now.

mrbluecoat 3 hours ago | parent | prev [-]

I feel the same way. They're just going to appeal the case until they find a layer of the legal system where they have leverage.

kogasa240p an hour ago | parent | prev | next [-]

Great news, but this will probably be the catalyst for more "age verification" nonsense. These algorithms are bad for everyone, not just kids.

ChrisArchitect 2 hours ago | parent | prev | next [-]

[dupe] Discussion: https://news.ycombinator.com/item?id=47520505

homeonthemtn an hour ago | parent | prev | next [-]

Just give people the option to turn off algos. "I do not consent to suggested content"

yacin 4 hours ago | parent | prev | next [-]

this has to be the first of many right? fingers crossed this leads to some meaningful change.

jeffbee 4 hours ago | parent | next [-]

You mean it's the first of many appeals, I assume.

Trial courts will decide pretty much anything. Then the case gets appealed over whether the trial court correctly interpreted things you probably perceive as uncomplicated, like the 1st Amendment.

2OEH8eoCRo0 4 hours ago | parent | prev [-]

It's a huge deal because it was the bellwether case for over 1,000 other similar cases.

yacin 4 hours ago | parent [-]

ah yup:

> It comes on the heels of a Delaware court decision clearing Meta’s insurers of responsibility for damages incurred from “several thousand lawsuits regarding the harm its platforms allegedly cause children” — a ruling that could leave it and other tech titans on the hook for untold future millions.

trollbridge 4 hours ago | parent | next [-]

Yep. The insurance covers accidents and negligence, not deliberate decisions to impose harm to children for financial gain.

guzfip 4 hours ago | parent | prev | next [-]

Sounds too good to be true. I’ll hold my breath.

AlienRobot 4 hours ago | parent | prev [-]

I wonder at what point children become such a liability for platforms that it's easier to just ban all children altogether.

Children don't have disposable income to buy ads/subscriptions. They don't have experience to write about. The only thing they have that adults don't is time which translates into engagement metrics.

In an ideal world, the adults who buy and manage the computers would create age-restricted accounts for children, and the OS would pass this information to the browser, which would simply transmit it via HTTP. This is the safest way to verify ages. If an operating system doesn't want to support this, it's ultimately the adult's responsibility to install one that does. This would mean no burden on adults (the majority of the planet) to verify their ages, and therefore no burden on the platforms to restrict ages either.

If platforms could verify ages without inconveniencing their main user base, I wonder if they would just start banning all minors, or if there is some reason to allow minors on the platform that justifies all the liability surrounding them.
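As a hypothetical sketch of the OS-to-browser-to-platform flow described above: the server would gate content on an age signal forwarded by the browser. The "X-Age-Bracket" header name and its values are invented for illustration; no such standard header exists today.

```python
# Hypothetical sketch: a server gating its experience on an age
# signal that the OS hands to the browser and the browser forwards
# as an HTTP request header. Header name/values are invented.

def allowed_content(headers: dict) -> str:
    """Decide which experience to serve based on a hypothetical
    age-bracket header forwarded from the OS account settings."""
    bracket = headers.get("X-Age-Bracket", "unknown")
    if bracket == "adult":
        return "full"          # no restrictions
    elif bracket == "minor":
        return "restricted"    # age-gated features disabled
    else:
        # No signal: the platform falls back to its own policy,
        # e.g. treating the user as unverified.
        return "unverified"

print(allowed_content({"X-Age-Bracket": "minor"}))  # restricted
print(allowed_content({}))                          # unverified
```

The point of the scheme is that verification happens once, at the OS account level, and platforms only ever see a coarse bracket rather than an ID document.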

WarmWash 2 hours ago | parent | next [-]

Children are an extremely valuable ad target.

They have their hands directly on their parents' heartstrings, and their parents have a credit card.

This isn't anything new, think about the toy ads we had on TV when we were young.

AlienRobot 2 hours ago | parent [-]

I guess you are right. I assumed that something like YouTube Kids would have no ads at all given the audience, but it seems it does have ads targeted at young children. Bleak world we live in.

germinalphrase 3 hours ago | parent | prev [-]

Nobody takes “age-restricted account[s] for children” seriously.

Parental controls and age restrictions are almost universally half-baked, buggy fig leaves meant to deflect negative attention from software and content providers.

topheroo 2 hours ago | parent | prev | next [-]

They were also designed to addict adults, just saying.

AnimalMuppet 2 hours ago | parent [-]

Right, but adults are assumed to be somewhat more responsible for themselves. This is why we don't let kids (legally) smoke or drink, but we do let adults do so. We expect that adults can, in general, say no, and that children are less able to do so.

But it's not absolute. Some drugs are illegal for adults as well, for example. Why? Because they're too addicting.

So are Instagram and Youtube just nicotine, or are they heroin?

bethekidyouwant an hour ago | parent | prev | next [-]

Let me disable short form video content. Jfc YouTube…

pautasso 3 hours ago | parent | prev | next [-]

Everyone is now posting on social media about how the phrase "social media is addictive" is going viral.

nlarion 3 hours ago | parent | prev | next [-]

There is no personal responsibility left in America. I have a child. It's my job to teach him and watch what he watches and does. I guess I am the only one who thinks this way. Good luck having the parental government raise your child. Parody: I let my child have cocaine and now they're addicted!!!!! Hilarious.

freshtake 2 hours ago | parent | next [-]

How old is your child? Younger than 6-8 it's easy to monitor what they're watching and enforce limits. By age 9-10 it isn't just about what they access in the home. Many schools in America are giving kids computer and tablet access, and kids are smart or curious enough to access social media there.

I agree that a big part of this is educating children about these hazards, but that also doesn't mean we should allow these companies to data science the shit out of our attention and willpower. Many adults have concerning relationships with social media too -- exposure, pressure, and manipulation are key ingredients that are difficult for anyone to deal with.

nlarion 33 minutes ago | parent [-]

Yeah, it's too bad there aren't any tools you can use to block any content at your home that YOU personally deem irresponsible /s. I'm not sure what your argument is here. If it's for regulation, then please do some reading on regulatory capture before you hand over your ID card while logging in to respond to my comment.

dj_gitmo 2 hours ago | parent | prev [-]

> Parody: I let my child have cocaine and now they're addicted!!!!! Hilarious.

Cocaine is illegal because it is addictive.

IncreasePosts an hour ago | parent | next [-]

LSD and hallucinogenic mushrooms aren't addictive and aren't legal. Cigarettes and alcohol are addictive and are legal.

nlarion 17 minutes ago | parent | prev [-]

Yet I know many people who've done cocaine who are in other respects law-abiding citizens. Making unjust laws makes us all criminals. The government cannot protect people from themselves; no one can. The best we can do is try to educate, and we can't even seem to do that. Good luck out there, buddy.

gervwyk 3 hours ago | parent | prev | next [-]

now do Candy Crush..

superkuh 3 hours ago | parent | prev | next [-]

It's amazing that a jury of people completely ignorant of what medical addiction is managed to make this discovery despite thousands of scientists around the world being unable to confirm this hypothesis. Which is to say: this is extreme bullshit which has nothing to do with reality or science or empirical study and instead is based entirely on feels and popular memes about "dopamine hits" (no basis in reality).

ceejayoz 3 hours ago | parent [-]

Parent poster has some… interesting views on addiction. Specifically, an extremely narrow view of what counts, that rules out things like gambling and porn addiction.

https://hn.algolia.com/?query=superkuh%20addiction&type=comm...

elteto 2 hours ago | parent [-]

You are watching astroturfing in real time.

baggy_trough 3 hours ago | parent | prev | next [-]

Doritos now liable for creating a good tasting chip? This is madness.

Ajedi32 2 hours ago | parent | next [-]

Yeah, people keep making the comparison to cigarettes but to me this is wildly different.

Cigarettes directly cause physical harm and even death. Social media can sometimes, depending on the circumstances and who exactly you're interacting with, indirectly contribute to emotional harm.

Cigarettes are also physically addictive. Your body actually becomes dependent on them and will throw a fit if you try to stop using them. Social media is only "addictive" in the loose sense that all fun, mentally engaging activities are.

I'm not saying social media is fine for kids and we shouldn't do anything to reduce their use of it (TV and video games can be equally unhealthy IMO). I'm not even necessarily against legislation on the subject. But there's a huge difference between fining a company for breaking a law, and fining them for making a perfectly legal product "too fun" because you let your kids spend all their time on it and that turned out to be unhealthy.

This type of civil litigation, where the courts effectively create and enforce ex post facto law based on their opinion about whether perfectly reasonable, 100% legal actions indirectly contribute to bad outcomes, is not a great aspect of our legal system IMO.

freshtake 2 hours ago | parent [-]

There are different kinds of addiction. The difference is physical vs. mental.

The best example of this is heroin, which has both a severe physical and mental addiction component, and it's the mental addiction that makes relapse so common.

Mental addiction rewires the brain's chemistry, causing the user to seek the substance and to find joy only in it. This is a better comparison for social media (albeit not as destructive or as instantaneously harmful as narcotics).

Ajedi32 an hour ago | parent [-]

Everything you do or even just think about "rewires" your brain to some extent. The difference with addictive drugs is that they do so in a way that bypasses your brain's natural processes. The same cannot be said for "addiction" to games, social media, or other entertainment.

There can still be social ills associated with these forms of natural "addiction" (e.g. gambling), and I'm okay with regulating those ills, but I'm less okay with the courts doing so unilaterally based on their subjective opinions with no concrete law backing them up.

OptionOfT 3 hours ago | parent | prev | next [-]

One could argue that the ultra-processed food industry is doing exactly what the tobacco industry did with respect to making its products addictive.

There is a difference between creating a food that tastes good and creating a food that tastes good but instantly makes you want to eat the whole bag.

bknight1983 3 hours ago | parent | prev | next [-]

Normally I don't see people walking down the street staring at their Doritos

bogdanoff_2 3 hours ago | parent | prev [-]

addictiveness != enjoyment

Although to some extent they're correlated, sometimes the things that are most enjoyable you wouldn't describe as "addicting" and vice-versa.

Eating a nice full meal is more enjoyable than eating Doritos on your couch, but you wouldn't describe it as addicting.

If anything, I find my experience of youtube today to be less enjoyable than in the past

techteach00 2 hours ago | parent | prev | next [-]

I gotta be honest. I saw the photo of the plaintiffs as the jury decision came back. They looked exactly like someone who just won the lottery. Philosophical or moral displays of victory look different.

I believe the plaintiffs solely care about becoming millionaires. No concern for how these rulings will further erode user privacy/rights online.

jjice 2 hours ago | parent | next [-]

Is that based entirely on their expressions and reactions? I mean, you might be right, but a momentary expression feels like too little to base such a damning statement on.

post-it 2 hours ago | parent | prev | next [-]

Body language analysis of strangers is bunk pseudoscience and a great way to reinforce your prejudices.

IncreasePosts an hour ago | parent | prev [-]

I thought the same thing. I took solace in the fact that it may be appealed, and that I suspect lawyers and taxes will take a large chunk out of the settlement

bknight1983 3 hours ago | parent | prev | next [-]

When you put something out there, there's a question of ownership for how people end up using it.

- Some think "if you use it incorrectly, it's your fault" and probably agree that Palantir is not evil software and that one must "change the administration".

- Some think "if you use it incorrectly, it's the creator's fault" and then you get safety labels on everything (see Prop 65).

It's a spectrum of risk between the user and the creator. My opinion is that there's enough scientific evidence to show that social media has a negative impact on kids and teenagers, as their brains are still developing. I think a social media ban for kids is a good thing (similar to a driver's license or a drinking age).

another-dave an hour ago | parent [-]

If you deliberately design your platform to be addicting then you can't say people who become addicted are "using it wrong" though.

Hobadee 3 hours ago | parent | prev [-]

Is the addictiveness of social media great? No. But the blame shouldn't be placed squarely on the companies either. What happened to personal responsibility? I was addicted to Facebook, I realized it, and I disconnected from it. I had withdrawals for a while (pulling out my phone and trying to open the app I had deleted without really thinking about what I was doing) but I quit. I know I am addicted to YouTube shorts, so I stay away from them. Occasionally I'll go on a bender and a few hours will slip by without me realizing, but while I know YouTube is designing them to be addictive, I blame myself for falling for it.

There are plenty of things in life that can be addicting; drugs, sex, money, power, adrenaline, entertainment, technology... The list goes on. If we remove everything addicting from life, you better believe something else will rise up to take its place.

The solution therefore isn't to remove everything addicting from life, but rather to raise everyone with the forethought to know what might be addictive, the self-awareness to realize when you are addicted to something, and the self-control (and support systems if and when necessary) to stop.

scottious 2 hours ago | parent | next [-]

Personal responsibility is important. But at the same time, we don't let people open up a heroin shop and then claim it's your personal responsibility to not buy it and use it. We don't put slot machines in schools and then tell kids that they need self-control to not get addicted to gambling.

I don't know what the answer is, but it feels wrong to lean _entirely_ on personal responsibility. We live in a world in which we were simply not evolved to live in. People literally make a good living by engineering and exploiting our weaknesses for profit.

> raise everyone with the forethought to know what might be addictive, the self-awareness to realize when you are addicted to something, and the self-control (and support systems if and when necessary) to stop

If only it were that easy. If you've ever known somebody who struggles with a serious addiction you'll know that even when they know it's destroying their life they still can't stop.

superultra an hour ago | parent | prev | next [-]

I’m glad you went through that and came out ok.

It seems though, increasingly, that the ability to avoid addiction is less about pulling one up by one’s own bootstraps, and in many ways determined more by genetics. That is to say, what might have been possible for you is much harder for others.

Look no further than GLP-1. People who have struggled for years - decades - with overeating are almost immediately able to cut back on addictive eating. It’s not that they suddenly discovered willpower. It’s a biochemical effect.

It’s no wonder then that kids are more susceptible to addiction-forming behaviors. Their minds are pliable and teachable.

Why would we not legislate things that take advantage of that?

ddoolin 3 hours ago | parent | prev | next [-]

Maybe this applies more towards adults, but I don't think the correct answer for kids is only "just have self-control," something kids are notorious for not having. Certainly there's a lot of parental responsibility here but we can simultaneously hold companies responsible for their part too.

ValentinPearce 3 hours ago | parent [-]

It also is a situation where the ubiquity of these companies make it exceptionally difficult for parents to regulate access.

freshtake 2 hours ago | parent [-]

This. Also, technology is ever changing, and expecting parents to constantly keep up with feature rollouts on these platforms is unrealistic.

Personal responsibility IS important, but we also don't allow cigarette companies to advertise on billboards with cute characters (remember Joe Camel?)

simonh 3 hours ago | parent | prev | next [-]

The problem is that internal communications inside these companies raised concerns about the manipulativeness, and even deceptiveness of the algorithms and tactics they were using.

They weren't just consciously creating an attractive platform, they were consciously creating a manipulative platform.

nkrisc 3 hours ago | parent | prev | next [-]

Yes, personal responsibility is important. That doesn't mean we need to allow companies to attempt to addict as many people as they can.

The question we should be asking: are these technologies a net-positive to society?

ValentinPearce 3 hours ago | parent | prev | next [-]

If they are liable for making the thing addictive, it does mean it is their fault. In this case, the verdict specifically says it's designed to be addictive to children, of whom personal responsibility is probably not expected.

CarVac 3 hours ago | parent | prev | next [-]

We can't raise other people. We can prohibit addicting things like Facebook's news feed.

pearlsontheroad 3 hours ago | parent | prev | next [-]

Everyone should at least be a conscientious junkie.

imiric 3 hours ago | parent | prev | next [-]

On one hand: sure.

On the other, it's very different when companies explicitly design their products to be as addictive as possible.

We've been through this with Big Tobacco already. Nicotine and other tobacco substances are addictive on their own, but tobacco companies were prosecuted for deliberately making cigarettes as addictive as possible, besides also marketing to children. The parallels with Big Tech and social media are undeniable.

beepbooptheory 3 hours ago | parent | prev [-]

Don't blame yourself! You had an encounter in the world and were greatly affected. Anyone with the same predisposition and the same exposure as you would have fallen into the same situation, just as they would have pulled themselves out of it the same way.

It is not, like, a moral thing to become addicted to something. And the ability to pull yourself out of it is determined, whether you are conscious of it or not, by your broader circumstances and by the same predispositions that brought you there in the first place. At the end of the day we are all fucked up animals reeling from the ongoing consequences of prematurational helplessness.

We should feel together in our problems like this, not distinguish ourselves by how we might individually overcome them. You are not "better" when you find yourself standing over a beggar addict; you are lucky, never forget that. If for no other reason than that it's not a sustainable worldview otherwise: it leads to insecurity, anger, and relapse.

The dark truth of the world is that everyone is doing the best they can. How could they not? Why would they not? What is this thing that separates you from the addict or the murderer? Unless you have some spiritual convictions, I can't imagine what it is.

Just really, I know you had a powerful personal journey, but don't let it convince you that we are all fundamentally alone, because we are not, and it's good to help people who maybe need more help.