sazylusan 11 hours ago

Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know, the algorithm favors the dramatic, the controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.

cptnapalm 10 hours ago | parent | next

As I understand it, Twitter has something called Community Notes. So people can write things, but a tweet can potentially have an attached refutation.

prisenco 10 hours ago | parent

Community Notes are better than nothing, but each note only relates to a single tweet. So if one tweet with misinformation gets 100k likes, a community note might show up correcting it.

But if 100 tweets each get 1,000 likes, no single one is ever important enough to get a community note.

cptnapalm 10 hours ago | parent

Fair enough on that. The problem I've seen (and don't have a good idea of how to fix) is on Reddit, where the most terminally online are the worst offenders and simply drown out everything else until non-crazy people just leave. It doesn't help that the subreddit mods are disproportionately also the terminally online.

AfterHIA 10 hours ago | parent | prev | next

I feel that this is the right approach: the liability and toxicity of the platforms isn't due to them being communication platforms; it's that, in most practical or technical ways, they are not. They are deliberate behavior-modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit, in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.

In this sense, platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel whose spin helps stochastically determine where the next major school shooting will take place.

prisenco 10 hours ago | parent

Right, engagement algorithms are like giving bad takes a rocket ship.

The words of world-renowned epidemiologists who were, to be frank, boring and unentertaining could never possibly compete with crunchymom44628 yelling about how Chinese food causes covid.

Bad takes get the engagement of both the people who vehemently agree and the people who vehemently disagree. Everyone is incentivized to be a shock jock, and the shock jocks are then molded by the algorithm to be ever more shock-jockish.

Especially at a time when we were all thrown out of the streets and into our homes and online.
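A toy sketch of that dynamic in a few lines of Python (purely illustrative; the scoring function, names, and numbers are all made up, not any real platform's ranking): if a feed scores posts by raw interaction counts without distinguishing approval from outrage, the post that splits its audience outranks the one everyone quietly agrees with.

    # Toy model, not any real platform's algorithm: score = total interactions,
    # with no distinction between approval and outrage.
    def engagement_score(likes, angry_replies, quote_dunks):
        return likes + angry_replies + quote_dunks

    calm_expert   = engagement_score(likes=800, angry_replies=5,   quote_dunks=2)    # 807
    divisive_take = engagement_score(likes=400, angry_replies=350, quote_dunks=300)  # 1050

    # The divisive post wins the feed despite having half the likes.
    ranked = sorted([("calm_expert", calm_expert), ("divisive_take", divisive_take)],
                    key=lambda p: p[1], reverse=True)
    print(ranked)  # [('divisive_take', 1050), ('calm_expert', 807)]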

And here I'll end this by suggesting everyone watch Eddington.

sazylusan 11 hours ago | parent | prev | next

Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

The First Amendment was written in the 1700s...

hn_throwaway_99 11 hours ago | parent | prev

Glad to see this; I was going to make a similar comment.

People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

squigz 10 hours ago | parent

Online, sure. But online doesn't mean YouTube or Facebook.