| ▲ | slg 6 hours ago |
| > and then prioritize for outrage and emotionalism

This isn’t inherent to social networks though. It is a choice by the biggest social media companies to make society worse in order to increase profits. Just give us a chronological feed of the people/topics we proactively choose to follow and much of this harm would go away. Social media and the world were better places before algorithmic feeds took over everything. |
|
| ▲ | themanmaran 19 minutes ago | parent | next [-] |
| I think it's easy to blame the evil profit-maximizing social media companies. But IMO even the simplest 'engagement' algorithm will produce negative externalities, regardless of who's running it.

```
show_me_posts_people_like_me_have_liked()
- Jeff saw 20 posts today and liked 9 of them.
- Cliff saw 20 posts today and liked 9 of them.
- Jeff and Cliff had 6 overlapping likes.
- Show Jeff the 3 extra posts Cliff liked; show Cliff the 3 extra posts Jeff liked.
```

This seems like a simple, logical recommendation system. BUT the end result is that you make Jeff and Cliff closer to the same person over time. Multiply that by millions and you build echo chambers. And the biggest echo chambers (often those aligned with some identity politics) see they have a huge community and want to expand it, making the whole platform worse as a byproduct. |
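The mechanism in that pseudocode can be sketched in a few lines. This is a minimal illustration, not any platform's real system: the user names, post IDs, and the "most similar user" heuristic are all hypothetical, and a production recommender would weigh many more signals.

```python
# A minimal sketch of a naive "show me posts people like me have liked"
# recommender. All names and post IDs are hypothetical.

def recommend(likes: dict[str, set[str]], user: str) -> set[str]:
    """Recommend the posts liked by the most similar other user."""
    others = [u for u in likes if u != user]
    # Similarity = size of the overlap in liked posts.
    most_similar = max(others, key=lambda u: len(likes[user] & likes[u]))
    # Recommend what they liked that this user hasn't liked yet.
    return likes[most_similar] - likes[user]

likes = {
    "jeff":  {"p1", "p2", "p3", "p4", "p5", "p6", "p7", "p8", "p9"},
    "cliff": {"p1", "p2", "p3", "p4", "p5", "p6", "pA", "pB", "pC"},
}

print(sorted(recommend(likes, "jeff")))   # ['pA', 'pB', 'pC']
print(sorted(recommend(likes, "cliff")))  # ['p7', 'p8', 'p9']

# If both users like everything recommended to them, their like-sets
# become identical: the convergence the comment describes.
likes["jeff"] |= recommend(likes, "jeff")
likes["cliff"] |= recommend(likes, "cliff")
print(likes["jeff"] == likes["cliff"])    # True
```

Run once, the recommendations look harmless; run in a loop, the two users' histories merge, which is the echo-chamber dynamic in miniature.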
|
| ▲ | dathinab 6 hours ago | parent | prev | next [-] |
| > This isn’t inherent to social networks though. It is a choice by the biggest social media companies to make society worse in order to increase profits. Going beyond social media, it's IMHO the side effect of an initially innocent-looking but dangerous and toxic monetization model, one we find today not just in social media but even more so in news, apps, and most digital markets. |
| |
| ▲ | betty_staples 2 hours ago | parent | next [-] | | I will say I am strongly against what social media algorithms do. But I am fascinated by the Black Mirror element. They said "hey algorithm, what gets likes, what gets views - look into human nature and report back" - "human nature shows polarization does! emotionally charged, divisive content!" - that's just fascinating. Not learning, philosophy, growth, education, health, no, the naughty stuff, the bad stuff. That's not to exonerate social media companies, no, that would be the same as exonerating big tobacco for what they do. | | | |
| ▲ | trevwilson 6 hours ago | parent | prev | next [-] | | Yeah unfortunately this seems to be a common, if not inevitable, result of any product where "attention" or "engagement" are directly correlated with profitability. | |
| ▲ | slg 6 hours ago | parent | prev [-] | | And if we want to go beyond that, we really just have to blame capitalism. What happens when you build a society around the adversarial collection of money? You get a society that by and large prioritizes making money above all else including ethics and morals. | | |
| ▲ | makingstuffs 4 hours ago | parent | next [-] | | That and the fact that money and media presence are essentially what win elections. The only way we can really have democracy is with a truly informed populace, and the only way people can make a truly informed vote without all the noise is to have anonymous voting. By which I mean you do not know which politician/party you are voting for; you just know the policies they have promised to enforce.

Further to that, there needs to be accountability. Right now, in the UK at least, governments are not held to account at all. They get into office with grand promises of flying elephants and golden-egg-laying geese but obviously never follow through with said promises. The populace, ultimately, just shrugs it off with ‘politicians lie’ and continues complaining about it within their social circles.

Our political systems are fundamentally broken. We shouldn’t care if policies are from party A or party B. All that should matter is the content of the policy and whether it is ever actually materialised. Right now we have a situation where people are manipulated left, right and centre into believing a given party’s absolute BS manifesto, which they write in the full knowledge that not delivering will have very little impact on them, as they’ve just had a substantial amount of time getting paid lucrative salaries to essentially argue with a bunch of other liars in a shouting match on telly.

Remove the football-esque fandom which applies to political parties by removing any ability to publicly affiliate any given person with said party and I’d bet we see different results across the board. Remove all this absolute nonsense of politicians promoting their ideologies on TV/Twitter etc. and you will remove a lot of the brainwashing which happens. Remove the most corrupt situation of all, private firms and individuals being able to fund political parties, and you level the playing field.

Obviously this is a hard pill for many to swallow, as no one likes to be told they’ve essentially been brainwashed into their thoughts, and ego is everything in modern society. | |
| ▲ | Zigurd 5 hours ago | parent | prev | next [-] | | It's the combination of software that is infinitely malleable, and capitalism. Successful entrepreneurs in software want liquidity. So no matter how benevolent they start out being, they eventually lose control and the software gets turned into an exploitative adversary to satisfy investor owners. This is fine if you can refuse the deal. Lots of software and the companies selling it have died that way. But if you've made a product addictive or necessary for everyday survival, you have the customer by the short hairs. The technology underlying Bluesky is deliberately designed so that it's hard to keep a customer captive. It will be interesting to see if that helps. | |
| ▲ | dathinab 5 hours ago | parent | prev [-] | | yes, but it's more complicated. If you look at the original reasoning for why capitalism is a good match for democracy, you find arguments like voting with money etc., _alongside the things that must not be tolerated in capitalism_ or it will break. And that includes stuff like:

- monopolies (or, more generally, anything having too much market power and abusing it; it doesn't need to be an actual monopoly)
- unfair market practices which break fair competition
- situations which prevent actual user choice
- too much separation between the wealth of the poorest and richest in a country
- giving money too many ways to influence politics
- using money to bar people from a fair trial / from enforcing their rights
- also, I personally would add opacity, but I think that only really started to become a systemic issue with globalization and the digital age

This also implies that for markets which have natural monopolies, strict regulation and consumer protection are essential.

Now the points above are to some degree a checklist of what has defined US economics, especially in the post-Amazon age (I say post-Amazon age because the founding story of Amazon was a milestone and is basically the idea of "let's systematically destroy any fair competition and use externally sourced money (i.e. subsidization) to forcefully create a quasi-monopoly", and after that succeeded it became somewhat of the go-to approach for a lot of "speculative investment" funding).

Anyway, to come back to the original point: what we have in the US has little to do with the idea of capitalism which led to its adoption in the West. It's more like someone took it and is twisting it into the most disturbing dystopian form possible; they just aren't fully done yet. | | |
| ▲ | slg 5 hours ago | parent [-] | | > giving money too many ways to influence politics

I think what we're learning is that mass (social) media means that this simply isn't preventable in a world with free speech. Even if the US had stricter campaign finance laws in line with other western democracies, there would still need to be some mechanism so that one rich guy (or even a collection of colluding rich guys) can't buy a huge megaphone like Twitter or CBS. As long as there is no upper limit on wealth accumulation, there is no upper limit on political influence in a capitalistic democracy with free speech. Every other flaw you list is effectively downstream of that, because the government is already susceptible to being compromised by wealth. |
|
| ▲ | frogpelt 3 hours ago | parent | prev | next [-] |
| Before there was social media, there were clickbait headlines from supposedly reputable news agencies. Social media gave people easy ways to engage and share. And it turns out what people engage with and share is clickbait/rage bait. So maybe not technically inherent, but a natural consequence of creating networks for viral sharing of content. |
|
| ▲ | ambicapter 2 hours ago | parent | prev | next [-] |
| The existence of "yellow journalism" in the 19th century would disagree with that statement. That outrage and emotionalism grab humans' attention more than other feelings is a biological fact that has been exploited for centuries, the same way gambling has been around for the entirety of recorded human history. It's a default behavior pattern installed in every human; some can override it, most don't. |
| |
|
| ▲ | numpad0 4 hours ago | parent | prev | next [-] |
| >> Social media itself is a grand experiment. What happens if you start connecting people from disparate communities, and then prioritize for outrage and emotionalism?
> It is a choice by the biggest social media companies to make society worse in order to increase profits.
I think there is a more pointed way to frame this ongoing phenomenon: the US invested in social media assuming it would be the mainstay of its cultural dominance into the 21st century, and it wasn't. It was more of a giant oil pipeline with a check valve, leaving the US completely prone to East Asian influence, and it's scrambling at damage control. The US as it is has no cultural industrial base to produce social media content. East Asian content, if not East Asian indigenous social media, easily wins the Internet by leveraging universally strong public education, without even being intentional. That's what happened, and that must be the intent of the shift to rage political shows, which the US/EU can at least produce, even if it isn't useful. |
| |
| ▲ | ChrisMarshallNY 3 hours ago | parent [-] | | I’m not sure that’s especially new. In the 1990s, we had Japanophiles in the US. Right now, Korea is taking the crown. We’ll have to see who’s next. It might be China, but their culture is so controlled that it might not happen. Russia was starting to have a lot of cultural influence, until Iron Curtain 2.0 slammed down. Viral culture requires a certain amount of freedom of expression, along with access to media. |
|
|
| ▲ | protocolture an hour ago | parent | prev | next [-] |
| Yeah, it's crazy it hasn't been around long, yet I yearn for the old days of even 10 years ago, when my feed was still mostly things my friends were doing. The problem with social media is that it has gone off book. |
| |
| ▲ | energy123 an hour ago | parent [-] | | I posit that the growing anti-immigration sentiment partly comes from informational borders being eliminated by social media. People want some of that friction back but can't put their finger on where the unease is coming from. I further posit that this partly explains why Trump's approval ratings around immigration are very negative despite his successfully stopping immigration, which was one of the issues he polled the best on. He took away immigration, but the unease and insecurity from having zero information borders remains. (Yes, there are other factors too, especially around ICE conduct.) |
|
|
| ▲ | fwipsy 43 minutes ago | parent | prev | next [-] |
| How do you know the causality isn't reversed? Maybe the social media platforms that prioritized outrage became the biggest. Don't hate the player, change the game. |
|
| ▲ | energy123 an hour ago | parent | prev | next [-] |
| Your post needs to be absorbed and spread by everyone. The public debate has given us a false choice between censorship and no censorship. It's the wrong dimension. |
|
| ▲ | skybrian 5 hours ago | parent | prev | next [-] |
| It sure seems inherent to me. You get outrage and emotionalism even in small Internet forums. Moderation is necessary to damp it down. |
|
| ▲ | dylan604 6 hours ago | parent | prev | next [-] |
| bigMedia has been doing this longer than the socials. The socials just took the knob and turned it to 11. |
|
| ▲ | Aurornis 5 hours ago | parent | prev [-] |
| > Social media and the world were better places before algorithmic feeds took over everything

Sometimes I feel like I'm the only one who remembers how toxic places like Usenet, IRC, and internet forums were before Facebook. Either that, or people only remember the past of the internet through rose-colored glasses. Complain about algorithmic feeds all you want, but internet toxicity was rampant long before modern social media platforms came along. Some of the crazy conspiracy theories and hate-filled vitriol that filled Usenet groups back in the day make the modern Facebook news feed seem tame by comparison. |
| |
| ▲ | linguae 5 hours ago | parent | next [-] | | I agree that there’s always been toxicity on the Internet, but I also feel it’s harder to avoid toxicity today since the cost of giving up algorithmic social media is greater than the cost of giving up Usenet, chat rooms, and forums. In particular, I feel it’s much harder to disengage with Facebook than it is to disengage with other forms of social media. Most of my friends and acquaintances are on Facebook. I have thought about leaving Facebook due to the toxic recommendations from its feed, but it will be much harder for me to keep up with life events from my friends and acquaintances, and it would also be harder for me to share my own life events. With that said, the degradation of Facebook’s feed has encouraged me to think of a long-term solution: replacing Facebook with newsletters sent occasionally with life updates. I could use Flickr for sharing photos. If my friends like my newsletters, I could try to convince them to set up similar newsletters, especially if I made software that made setting up such newsletters easy. No ads, no algorithmic feeds, just HTML-based email. | |
| ▲ | eddythompson80 4 hours ago | parent | prev | next [-] | | You’re absolutely right. Shocking, rage-bait, sensational content was always there in social media, long before algorithmic feeds. As a matter of fact, “algorithmic feeds” were in a way always there; it’s just that back in the day those “algorithms” were very simple (most watched/read/replied today, this week, this month; longest, shortest, newest, oldest, etc.). I think the main thing algorithmic feeds did was present the toxicity as the norm, as opposed to it being a choice you make.

Like, I used to be part of a forum back in the early 2000s. Every few weeks the most-replied thread would be some rage bait or sensational thread. Those threads would keep getting pushed to the top and remain at the top of the forum for a while, growing very quickly as a ton of people kept replying. But you could easily see that everyone else was carrying on with their day. You ignore it and move on. You sort by newest or filter it out and you’re good. It was clear that this was a particularly heated thread and you could avoid it. Also, mods would often move it to a controversial subforum (or lock it altogether if they were heavy-handed). So you sort of had to go out of your way to get there, and then you would know that you were actively walking into a “controversial section” or “conspiracy” forum, etc. It wasn’t viewed as normal. You were a crazy person if you kept linking to and talking about that crazy place.

With algorithmic feeds, it’s the norm. You’re not seeking out shady corners of the internet or subscribing to a crazy Usenet newsgroup to feed your own interest in rage or follow a conspiracy. You are just going to the Facebook or Twitter or Reddit or YouTube homepage. Literally the homepages of the most mainstream, biggest companies in the US. Just like everyone else. | |
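The contrast between an old-forum sort and a personalized feed can be made concrete. In this sketch, all thread data and the scoring weights are invented for illustration: the forum "algorithm" is one global ranking every visitor sees, while the personalized feed computes a different ranking per user, so no two users share a baseline for what "normal" looks like.

```python
# Hypothetical threads: (title, reply_count, tags).
threads = [
    ("rage bait megathread", 120, {"politics"}),
    ("help with my garden", 40, {"gardening"}),
    ("new album discussion", 90, {"music"}),
]

# Old-forum "algorithm": one global sort by reply count. Everyone sees
# the same front page, so the heated thread is visibly an outlier.
def forum_front_page(threads):
    return sorted(threads, key=lambda t: t[1], reverse=True)

# Personalized feed: a per-user ranking weighted by assumed interest
# (weights here are arbitrary), so each user gets a different page.
def personalized_feed(threads, interests):
    def score(thread):
        _, replies, tags = thread
        return replies * (2.0 if tags & interests else 0.5)
    return sorted(threads, key=score, reverse=True)

print(forum_front_page(threads)[0][0])                  # rage bait megathread
print(personalized_feed(threads, {"gardening"})[0][0])  # help with my garden
```

On the shared front page the rage thread sits at the top for everyone and is recognizable as an outlier; in the personalized feed, each user's top item simply looks like the site.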
| ▲ | intended an hour ago | parent | prev | next [-] | | Having moderated both PHP forums and SM sites, quantity is its own quality. Not to mention we have adversaries to contend with now. I still remember seeing Palantir slides for sock puppet management tools way back in the day. That was the SOTA at one point. Today? SM pushed connected humanity past a critical connected mass that Usenet and IRC never could. | |
| ▲ | csnover 4 hours ago | parent | prev [-] | | You aren’t the only one who remembers. But in that time it was a self-selecting process. The problem with “the algorithm”, as I see it, is not that it increases the baseline toxicity of your average internet fuckwad (though I do think the algorithm, by seeking to increase engagement, also normalises antisocial behaviour more than a regular internet forum by rewarding it with more exposure, and in a gamified way that causes others to model that antisocial behaviour). Instead, it seems to me that it does two uniquely harmful things. First, it automatically funnels people into information silos which are increasingly deep and narrow. On the old internet, one could silo themselves only to a limited extent; it would still be necessary to regularly interact with more mainstream people and ideas. Now, the algorithm “helpfully” filters out anything it decides a person would not be interested in—like information which might challenge their world view in any meaningful way. In the past, it was necessary to engage with at least some outside influences, which helped to mediate people’s most extreme beliefs. Today, the algorithm successfully proxies those interactions through alternative sources which do the work of repackaging them in a way that is guaranteed to reinforce, rather than challenge, a person’s unrealistic world view. Many of these information silos are also built at least in part from disinformation, and many people caught in them would have never been exposed to that disinformation in the absence of the algorithm promoting it to them. In the days of Usenet, a person would have to get a recommendation from another human participant, or they would have to actively seek something out, to be exposed to it. Those natural guardrails are gone. Now, an algorithm programmed to maximise engagement is in charge of deciding what people see every day, and it’s different for every person. 
Second, the algorithm pushes content without appropriate shared cultural context into faces of many people who will then misunderstand it. We each exist in separate social contexts with in-jokes, shorthands for communication, etc., but the algorithm doesn’t care about any of that, it only cares for engagement. So you end up with today’s “internet winner” who made some dumb joke that only their friend group would really understand, and it blows up because to an outsider it looks awful. The algorithm amplifies this to the feeds of more people who don’t have an appropriate context, using the engagement metric to prioritise it over other more salient content. Now half the world is expressing outrage over a misunderstanding—one which would probably never have happened if not for the algorithm boosting the message. Because there is no Planet B, it is impossible to say whether things would be where they are today if everything were the same except without the algorithmic feed. (And, of course, nothing happens in a vacuum; if our society were already working well for most people, there would not be so much toxicity for the algorithm to find and exploit.) Perhaps the current state of the world was an inevitability once every unhinged person could find 10,000 of their closest friends who also believe that pi is exactly 3, and the algorithm only accelerated this process. But the available body of research leads me to conclude, like the OP, that the algorithm is uniquely bad. I would go so far as to suggest it may be a Great Filter level threat due to the way it enables widespread reality-splitting in a geographically dispersed way. (And if not the recommendation algorithm on its own, certainly the one that is combined with an LLM.) |
|