burlesona 2 hours ago

I think a better solution would be to repeal section 230 protection for any kind of personalized or algorithmic feed. The algorithm makes you a publisher, and you should be liable for what you publish.

That would make it very hard, nigh impossible, for a platform like YouTube or TikTok to exist as it does today, and would instead favor people self-curating mechanisms like RSS readers etc.

mike_hearn an hour ago

How is RSS self-curating? It's just a way to get a feed from somewhere. And under the maximally external-locus-of-control culture this jury is channeling, those feeds would themselves be deemed evilly addictive.

There is no solution for this kind of verdict beyond appeal, or changes to the law to rule such suits out, because it's not rooted in any logical or legal principle beyond the idea that people should not be responsible for their own actions (or their children's). And there's no limiting principle to that belief: you can't fix it with RSS, federation, chronological feeds, or making people select who they follow. Those would just get blamed for "addiction" instead.

burlesona an hour ago

In the RSS model, you opted in to each blog you follow. And each post comes from a person or a publication that can be held accountable for what they publish.

Ordinary media, like newspapers, books, radio, and TV, have worked this way forever — people publish “channels” and you decide what channels to follow. A channel can be held accountable.

The algorithm model is different. People just publish “content” into the platform, and the platform makes a custom channel for each viewer, inserting content from people you’ve never heard of and didn’t ask to follow. And it optimizes that custom channel for whatever addicts you the most. That’s fundamentally a different beast than opt-in media consumption.
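The opt-in model described above is simple enough to sketch in a few lines. This is a hypothetical toy reader, not any real RSS client: the feed XML is inlined rather than fetched over HTTP, and the names (`subscriptions`, `merged_feed`) are made up for illustration. The point it demonstrates is that the user's subscription list is the only input, and the merge is strictly chronological, so nothing is inserted that the reader didn't ask for.

```python
# Toy sketch of the opt-in "channel" model (hypothetical data; a real
# reader would fetch each subscribed URL over HTTP).
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# The user explicitly subscribed to each channel -- this list is the
# only source of content.
subscriptions = {
    "example-blog": """<rss><channel>
        <item><title>Post A</title>
              <pubDate>Mon, 01 Jan 2024 10:00:00 +0000</pubDate></item>
    </channel></rss>""",
    "example-news": """<rss><channel>
        <item><title>Post B</title>
              <pubDate>Tue, 02 Jan 2024 09:00:00 +0000</pubDate></item>
    </channel></rss>""",
}

def merged_feed(feeds):
    """Merge subscribed items into one chronological list (newest first).
    No per-user ranking: every subscriber to the same channels sees the
    same ordering."""
    items = []
    for source, xml in feeds.items():
        for item in ET.fromstring(xml).iter("item"):
            items.append((
                parsedate_to_datetime(item.findtext("pubDate")),
                source,
                item.findtext("title"),
            ))
    return sorted(items, reverse=True)

for when, source, title in merged_feed(subscriptions):
    print(when.date(), source, title)
```

Note there is no ranking function to optimize: the sort key is the publish date, which is attributable to the channel that set it.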

mike_hearn an hour ago

And if that blog is a newspaper or other aggregator? What makes the RSS feed of the CNN front page fine, but not the RSS feed of the YouTube front page?

There's really no difference. Media companies all aggressively optimize for engagement, often to the point of A/B testing headlines.

burlesona an hour ago

There’s a huge difference. Everyone sees the same front page on CNN, or on HN for that matter. Nobody sees the same page twice on YouTube or TikTok. That’s a fundamental distinction between human-curated media (even with A/B testing) and machine-curated media.

krapp an hour ago

>and would instead favor people self-curating mechanisms like RSS readers etc.

That isn't what would happen.

What would happen is that only the platforms which can afford legal teams - in other words, the big platforms - would host user-posted content, under strict arbitration-only terms, and every other platform (including Hacker News, which uses an algorithmic feed) would simply not. Removing one of the cornerstones of free speech on the web in favor of regulation will only centralize the web further.

And you wouldn't see mass adoption of "self-curating mechanisms" because most people aren't like Hacker News people, and would find the premise of having to manually curate data feeds from every site they visit to be a tedious waste of their time.

I also think that platforms like YouTube and TikTok shouldn't be illegal. I don't even think personalized algorithms should be illegal - it's surprising that one has to point this out on a forum of programmers - but algorithms have no inherent moral dimension, and the ability to use an algorithm to find and classify relevant content can be useful. The same algorithm that surfaces extremist content surfaces non-extremist content. The algorithm isn't the problem; the content and the policies of these platforms are. And I don't think the solution to either is de facto making math illegal and free speech more difficult.