burlesona 2 hours ago
I think a better solution would be to repeal Section 230 protection for any kind of personalized or algorithmic feed. The algorithm makes you a publisher, and you should be liable for what you publish. That would make it very hard, nigh impossible, for a platform like YouTube or TikTok to exist as it does today, and would instead favor self-curation mechanisms like RSS readers, etc.
mike_hearn an hour ago | parent | next
How is RSS self-curating? It's just a way to get a feed from somewhere. And under the maximally external-locus-of-control culture this jury is using, those feeds would themselves be deemed evilly addictive.

There is no solution for this kind of verdict beyond appeal, or changes to the law to rule such suits out, because it's not rooted in any logical or legal principle beyond the idea that people should not be responsible for their own actions (or their children's actions). But there's no limiting factor to that belief. You can't fix it with RSS, or federation, or making people select who they follow, or chronological feeds. Those would just get blamed for "addiction" instead.
| ||||||||||||||||||||||||||
krapp an hour ago | parent | prev
> and would instead favor people self-curating mechanisms like RSS readers etc.

That isn't what would happen. What would happen is that only the platforms which can afford legal teams - in other words, the big platforms - would host user-posted content, under strict arbitration-only terms, and every other platform (including Hacker News, which uses an algorithmic feed) would simply not. Removing one of the cornerstones of free speech on the web in favor of regulation will only centralize the web more. And you wouldn't see mass adoption of "self-curating mechanisms," because most people aren't like Hacker News people and would find the premise of having to manually curate data feeds from every site they visit to be a tedious waste of their time.

I also think that platforms like YouTube and TikTok shouldn't be illegal. I don't even think that personalized algorithms should be illegal - it's surprising that one has to point this out on a forum of programmers - but algorithms have no inherent moral dimension, and the ability to use an algorithm to find and classify relevant content can be useful. The same algorithm that surfaces extremist content surfaces non-extremist content. The algorithm isn't the problem; rather, the content and the policies of these platforms are the problem. And I don't think the solution to either is de facto making math illegal and free speech more difficult.