wkat4242 2 days ago

> Also, studies have shown people focus more on negative (https://en.wikipedia.org/wiki/Negativity_bias) and sensational (https://en.wikipedia.org/wiki/Salience_(neuroscience)#Salien...) things (and thus post/upvote/view them more), so an algorithm that doesn't explicitly push negativity and sensationalism may appear to.

This is exactly why it's a problem. It doesn't even matter whether the algorithm is trained specifically on negative content. The result is the same: negative content is promoted more because it sees more engagement.

The result is more discontent in society: people are constantly angry about something. Anger makes reasonable discussion impossible, which in turn drives polarisation and extremism in society and politics. This is what we're seeing all over the world.

And user-sourced content is a problem too, because anyone can use it to run manipulation campaigns. At least with traditional media there was an editor who would make sure fact checking was done. The social media platforms don't stand behind the content they publish.

nradov 2 days ago | parent | next [-]

Fact checking with traditional media was always pretty spotty. Even supposedly high quality publications like the NY Times frequently reported fake news.

bluGill 2 days ago | parent | prev [-]

It isn't just social media. I've been identified as a Republican and the previous owners of my house as Democrats, and since mail forwarding has expired I get their 'spam' mail too. The names are different, but otherwise the mail from each party is exactly the same: 'donate now to stop [the other party's] evil agenda'. They know outrage works and lean into it.