jmyeet 2 days ago
I believe there's a Chicxulub-level meteor headed for social media, and it's not addiction. It's liability. We, as a society, don't really care about addiction; that's reflected in our government's tolerance of gambling, nicotine, alcohol, drugs, etc. Remember that with tobacco it was the harm, not the addiction, that was the industry's undoing.

Core to all of this is what's colloquially become known as The Algorithm. Google in particular has successfully propagandized the idea that The Algorithm is a neutral black box over which we have no influence (for search). But every feature and behavior of any recommendation, ranking, or news-feed algorithm is the result of a human intentionally or negligently creating that behavior.

Most of us here should be aware that the way to get more distribution for a post or a video is through engagement: likes, comments, shares, reposts, quotes, and so on. All these companies measure those signals and optimize for engagement. That sounds neutral and possibly harmless, but it's not; I think it's foreseeably not harmless, and no doubt there's evidence along the way to demonstrate that harm. We've seen this with some very harmful ideas that get a lot of traction online: conspiracy theories, antivaxxer nonsense, doxxing queer people, swatting, the manosphere, and of course eating disorders. ED content has a long history on the Internet; you'll find pro-ana or "thinspiration" sites and forums going back to the 1990s.

So I think social media sites are going to have three huge problems going forward:

1. They knowingly had minors (including children under 13, which matters for COPPA) on their platforms, and they profited from that by knowingly or negligently selling those audiences to advertisers;

2. They knew they had harmful content on their platforms but hid behind Section 230, claiming to be simply the host for third-party content. I believe that shield is going to fail; and

3. They knowingly or negligently pushed that content to children to increase overall engagement.

One clue to all of this is that Mark Zuckerberg wants to push age verification into the OS. Isn't that weird? The one company that doesn't have an OS thinks the OS should handle that, or, more specifically, should be liable for age verification? That's so strange.

In an era where we have LLMs (and the systems that came before them) that can analyze posted content (including video) and derive features about that content, you don't get to plead ignorance or even user preference. These companies will be held liable for the harm caused by the content they distribute.