riazrizvi 5 hours ago

I think there's an axis of perceived wrongdoing here, and you and I fall on different points along it. Yours is more extreme: you say Meta was doing broad harm by exploring this activity, and you want greater damages to scare other businesses off from the general territory of addictive interfaces. Mine is that we want businesses to continue to explore and develop 'sticky', compelling user experiences, but Meta went too deep in some specific ways.

EDIT: I see I'm mixing up the New Mexico case yesterday on sexploitation with the addiction case in Los Angeles I thought we were talking about here.

munk-a 4 hours ago | parent | next [-]

To start off with my personal beliefs... I agree - I see a much broader harm in how platforms try to make themselves addictive, as I've worked on such systems in the past. I think the public, and even most technical folks who aren't deep into engagement metrics, underestimate how well studied the field is and how many iterations of approaches to daily engagement reminders, friction removal, and FOMO have been worked through to get to the point we're at today. In my opinion, which absolutely isn't fact, this work is broadly unproductive at improving our daily lives - I can understand that there are some compelling counter-arguments that these developments can be harnessed for good, but I don't share them.

But, specific to this article and setting aside my personal beliefs - I still find this judgement severely lacking. I don't think it is noticeable enough to Meta to have a significant impact on the way they do business, beyond tidying up some specifically egregious corners and making sure they communicate internally, going forward, in a way that appears to comply with the judgement. The judgement was enough, applied to this pool of users, to make these specific users unprofitable in retrospect (i.e. Meta would have more money if it had refused to do business with them at all). But I'm also concerned that the pool of considered victims was so narrow that it excluded a significant number of similarly harmed people, and that the amortized damages end up being negligible.

riazrizvi 4 hours ago | parent [-]

I guess we have deep, deep divisions about what everyone is doing in society, and what makes a 'good' society.

As I've aged, I've entered new-to-me territory where a good society needs to reflect the world as it is, so that its members have high survivability.

At the local family level, for instance: when my kids were young, I had dreams of being super financially successful so that I could give them lots of nice things. I just don't want that for them anymore. Protection, and pandering, do not make a good lineage IMO. It's something of a leap I'm asking of you, to connect this to my position here on Meta, but I've got other work to do, and I hope it's enough to convey my point.

gusgus01 4 hours ago | parent | prev | next [-]

This was about Meta's platforms not doing enough to protect children from sexual material (and allegedly ignoring employee warnings and lying to the public about it), not intrinsically their addictive interface and compelling user experience. I suppose the actions necessary to protect children from exposure to sexual material/exploitation could limit their ability to make certain changes to their platform - e.g. tighter moderation would reduce the amount of content that could be uploaded - but they could also have just not allowed children on the platform (like how Facebook started) and then not worried about child exploitation?

4 hours ago | parent [-]
[deleted]
ethanwillis 3 hours ago | parent | prev [-]

In what specific ways did it go too deep? It's hard to understand when you're being so vague.