LeafItAlone 11 hours ago

> Slow down our algorithmic hell hole.

What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

JumpCrisscross 11 hours ago | parent | next [-]

> What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

Time delay. No content-based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change their content in the meantime (in which case the timer resets).
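Purely to make the mechanism concrete (my own sketch, not anything a platform actually runs; all names here are made up), the whole rule could be a single timestamp that moves whenever the content changes:

    from datetime import datetime, timedelta, timezone

    DELAY = timedelta(hours=2)  # anywhere in the 2-to-24-hour range

    class Post:
        def __init__(self, body: str):
            self.body = body
            self.deleted = False
            # the clock starts when the post is submitted
            self.visible_at = datetime.now(timezone.utc) + DELAY

        def edit(self, new_body: str):
            self.body = new_body
            # editing resets the timer
            self.visible_at = datetime.now(timezone.utc) + DELAY

        def delete(self):
            self.deleted = True

        def is_visible(self) -> bool:
            return not self.deleted and datetime.now(timezone.utc) >= self.visible_at

No content inspection anywhere; the only input is time.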

I’d also argue for demonetising political content, but idk if that would fly.

LeafItAlone 11 hours ago | parent [-]

Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.

Who decides what falls in this bucket? The government? That seems to go against the idea that government shouldn't control speech and ideas.

JumpCrisscross 11 hours ago | parent [-]

> who makes it happen and enforces the rules?

Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)

> For all content or just “political”?

The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.

I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’ll just get meme names and nobody needs that.)

Bonus: electeds get constituent pressure to consolidate elections.

Alternative: these platforms already track trending topics, so an easy fix is to slow down trending topics. It doesn’t even need to be by much; what we want is for people to stop and think, to have a chance to reflect on what they’re doing, maybe take a step away from their device while they’re at it.
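To sketch what that could look like mechanically (again, hypothetical names and numbers, not any platform's actual ranking code), the extra hold time could simply scale with how fast a topic is spiking:

    from datetime import timedelta

    MAX_EXTRA_DELAY = timedelta(hours=2)

    def extra_delay(posts_last_hour: int, posts_prior_hour: int) -> timedelta:
        # hold back posts on spiking topics a bit longer, so people
        # get a chance to pause before piling on
        spike = posts_last_hour / max(posts_prior_hour, 1)
        if spike <= 1:
            return timedelta(0)  # not trending, no extra delay
        factor = min(spike - 1, 4) / 4  # cap the slowdown; it doesn't need to be much
        return factor * MAX_EXTRA_DELAY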

breadwinner 10 hours ago | parent | prev [-]

Easy solution: Repeal Section 230.

Allow citizens to sue social media companies for the harm caused to them by misinformation and disinformation. The government can stay out of this.

JumpCrisscross 10 hours ago | parent [-]

> Easy solution: Repeal Section 230

May I suggest only repealing it for companies that generate more than a certain amount of revenue from advertising, or that have more than N users and use algorithmic content elevation?

breadwinner 10 hours ago | parent [-]

That seems like a reasonable middle ground.