| ▲ | ktosobcy 5 days ago |
EU should do the same (FB & X). In general, anything that has "algorithmic content ordering" that pushes content triggering strong emotional reactions should be banned and burned to the ground. |
|
| ▲ | thinkingtoilet 5 days ago | parent | next [-] |
It's such an obvious poison. Social media is responsible for the destruction of civility on so many levels. It has destroyed a generation's attention span. It is a drug more powerful and addictive than something like weed. It seems like people here are too young to remember life before it. It has transformed society negatively in just a decade. It absolutely should go. I'm glad you did something positive on it, or found a community. You can still do that without social media. It needs to go. |
| |
| ▲ | ktosobcy 5 days ago | parent | next [-] | | IMHO there were better communities on old forums... | | |
| ▲ | thinkingtoilet 5 days ago | parent [-] | | And it was contained. If you have a small group, you can manage an asshole or two, sometimes it can even be endearing ("he's an asshole, but he's our asshole"). Once the numbers start going up the toxicity increases by orders of magnitude. It's impossible to moderate. The benefits nearly all fall away and the negatives are amplified. Add on the smartest people in the world working very hard to get everyone, including children, addicted to social media and it's fucking nefarious. | | |
| ▲ | diggan 5 days ago | parent | next [-] | | > Once the numbers start going up the toxicity increases by orders of magnitude. It's impossible to moderate. As someone who spent an embarrassingly long time on what lots of people claim to be the most toxic forum in the world (not sure about that, but it's the biggest in the Nordics, that's for sure), and even moderated some categories on that forum that many people wouldn't touch with a ten-foot pole, it really isn't that hard to moderate even when the topics are sensitive and most users are assholes. I'd argue that moderation is difficult today on lots of platforms because it happens too much "on the fly", so you end up with moderators interpreting and applying the rules differently depending on mood/topic/whatever. If you instead make a hard list of explicit rules, with examples, and also establish internal precedents that moderators can follow, a lot of the hard work around moderation basically disappears, regardless of how divisive the topic is. But it's hard and time-consuming work, and requires careful deliberation and transparent rulings. | | |
| ▲ | __s 5 days ago | parent [-] | | I think part of that was volunteer moderation. Were you paid to moderate those boards? Most moderation was a form of community involvement. Recent social media (& maybe "recent" no longer applies) doesn't have this kind of community run tooling. | | |
| ▲ | diggan 4 days ago | parent [-] | | > Were you paid to moderate those boards? No, none of the moderators were paid, but I do think the ~2/3 admins were paid. But yeah, I did it purely out of a desire for the forum to remain high-quality, as did most of the other moderators AFAIK. > Recent social media (& maybe "recent" no longer applies) doesn't have this kind of community run tooling Agree, although reddit with its "every subreddit is basically its own forum but not really" model (admins still delete stuff you wouldn't + vice-versa) kind of did an extreme version of community run tooling, with the obvious end result that moderation is super unequal across reddit, and opaque. Bluesky is worth mentioning as well, with their self-proclaimed "stackable" moderation, which is a bit of fresh air in the space. https://bsky.social/about/blog/03-12-2024-stackable-moderati... |
|
| |
| ▲ | threetonesun 5 days ago | parent | prev [-] | | Ah, this reminds me of the one asshole on the old car forum I used to heavily participate in, who would tell new users how dumb all their ideas for modifying their cars were. And yes, some would argue back, and then someone else would step in and point out all the threads from the cranky asshole where he'd already tried everything they were suggesting. |
|
| |
| ▲ | __MatrixMan__ 4 days ago | parent | prev | next [-] | | What needs to go is advertising. The evils of social media are not consequences of people using the internet to connect with other people, they're consequences of people using platforms where you can buy a following instead of having to earn it. | |
| ▲ | cindyllm 5 days ago | parent | prev [-] | | [dead] |
|
|
| ▲ | wmeredith 5 days ago | parent | prev | next [-] |
I saw a really good analogy the other day (on X, natch) that said subscribing to modern social media is like inviting a clown into your house to come by every 10 minutes and scream, "It's gotten worse!" I think about that a lot. Curation goes a long way, but it takes work. |
| |
| ▲ | mrcwinn 5 days ago | parent | next [-] | | Not to the same degree, but I'd argue HN has the same tendencies. Cynical, skeptical, assuming the worst intentions, a bogeyman tech giant hoping to destroy its own customers. Skepticism is, of course, healthy, but the default behavior in this community completely misses the reality that had we frozen progress, say, right around the Apple II launch, we would never have gotten Hacker News itself. :) And if you accept my premise, it's probably not the websites, but rather the humans themselves. | | |
| ▲ | yannyu 4 days ago | parent | next [-] | | It's one thing to have a community that has tendencies towards cynicism, skepticism, and assuming the worst. It's another thing to build an algorithm optimized for "engagement" which prioritizes polarizing content above all others because it keeps people addicted to the platform. Maybe the problem is the websites that amplify the most controversial and problematic content because they get the most clicks, so these companies can report better DAUs and MAUs. | | |
| ▲ | realz 4 days ago | parent | next [-] | | Might I add that Facebook has also proven time and time again that they believe in zero ethics. They will happily boost a dictator’s post. They’ll happily help a rapist win elections. They’ll happily let you sell addictive content to kids. Heck, they’ll even give you easy ways to target ads to “depressed 14 year old girls” specifically. | |
| ▲ | int_19h 4 days ago | parent | prev [-] | | The problem is that humans themselves will amplify the most controversial and problematic content because anger is one of the strongest emotions. https://www.youtube.com/watch?v=rE3j_RHkqJc |
| |
| ▲ | hshshshshsh 5 days ago | parent | prev [-] | | Have you worked in a fortune 500? |
| |
| ▲ | fluoridation 5 days ago | parent | prev | next [-] | | It just comes down to how you use it. I use Twitter and BlueSky exclusively to follow artists, and all I see is art. If I didn't come to HN, I don't think I'd hear about any news. | | | |
| ▲ | socalgal2 5 days ago | parent | prev | next [-] | | Exactly why I often think I should stop reading HN | |
| ▲ | op00to 5 days ago | parent | prev | next [-] | | The clown also shows you pictures of how awesome everyone else is doing and asks why you are so fat and ugly and boring in comparison. | |
| ▲ | Mistletoe 4 days ago | parent | prev [-] | | Beautiful description of our current life. |
|
|
| ▲ | godshatter 5 days ago | parent | prev | next [-] |
I'm not a big fan of banning things like this. There's good mixed in with the bad, and banning things will only lead to new social media sites rising in their place. I don't expect them to be any better. This is basically a fight against human nature. If I could get one wish, it would be legislation that forces social media sites to explain in detail how their algorithms work. I have to believe that a company could make a profitable social media site that doesn't try all the tricks in the book to hook users and rile them up. It may not be Meta-sized, but I would think there would be a living in it. |
| |
| ▲ | strbean 4 days ago | parent | next [-] | | > I'm not a big fan of banning things like this. I think this is a pretty perfect use case for banning. The harms are mostly derived from the business model. If the social media companies were banned from operating them, and the bans were evaded by DIYers, Mastodon and the like, most of the problems disappear. When there's still money in the black market alternative, banning doesn't work well (see: narcotics). | |
| ▲ | op00to 5 days ago | parent | prev | next [-] | | I don’t think people want to understand how algorithms manipulate them. | |
| ▲ | paultnylund 4 days ago | parent | prev [-] | | [dead] |
|
|
| ▲ | txrx0000 4 days ago | parent | prev | next [-] |
| There is immense value in the ability to share realtime events with the rest of the world. If the curation algorithm is the problem, then the solution should target only that, not "BLOW IT ALL UP". There are a few ways: 1) We can build open-source clients with user-configurable client-side recommendation algorithms. 2) We can shame the people actively working to make this problem worse, especially if they make 1) or 3) harder. 3) We can build decentralized protocols like Nostr to pry social media from the hands of tech giants altogether. These solutions are not mutually exclusive, so we should pursue all of them. |
|
| ▲ | rdm_blackhole 4 days ago | parent | prev | next [-] |
Yes, let's give more power to the EU, the entity that's been trying to ban encryption within the EU for the last 3 years and wants to read all your messages, scan all your pictures, but pinky promise, it won't use the data to hunt down political dissidents or silence opposing views. I am sure it's going to be swell. Let's also require tech companies to only allow content that has been approved by the central committee for peace and tolerance (TM) while we are at it! No risk of censorship there. |
| |
| ▲ | Jon_Lowtek 4 days ago | parent [-] | | The EU is not a single mind; it is a multi-party democracy. Yes, there are forces in it that have been pushing for "lawful interception" for some time now. And they have always failed to ban E2E encryption. In the USA there exist similar forces, who have also introduced bills with similar ideas multiple times in the last decade. One of those is currently in Congress. |
|
|
| ▲ | Karrot_Kream 5 days ago | parent | prev | next [-] |
| > pushes content triggering strong emotional reactions should be banned Aren't you describing your own comment? Aren't upvotes pushing that to the top? So isn't HN the thing that needs to be banned according to your comment? |
| |
| ▲ | blargey 4 days ago | parent | next [-] | | The opposite, actually - I remember reading that HN downranks posts that have a low favorability:engagement ratio - in its case, high comment count and comparatively low votes. The reasoning being that flamebait topics inspire a disproportionate number of angry/low-substance/pile-on comments and retort-chains compared to normal topics, without garnering a corresponding increase in top-level votes. It's imperfect, but afaik most social media does the opposite (all "engagement" is good engagement), and I imagine, say, Twitter would be much nicer if it tuned its algo to not propagate posts with an unusually high view/retweet count relative to likes. | | |
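The downranking idea described above can be sketched in a few lines. This is an illustration only, not HN's actual (unpublished) ranking code; the threshold and penalty values are made up:

```python
def adjusted_score(votes: int, comments: int,
                   flame_ratio: float = 2.0, penalty: float = 0.5) -> float:
    """Downrank stories whose comment count far outpaces their votes.

    A high comments-to-votes ratio serves as a proxy for flamebait:
    lots of arguing, comparatively few endorsements. The 2.0 cutoff
    and 0.5 penalty are arbitrary illustrative values.
    """
    score = float(votes)
    if votes > 0 and comments / votes > flame_ratio:
        score *= penalty  # halve the rank weight of likely flamewars
    return score

# Hypothetical front-page items: the flamebait story has more raw
# engagement but ends up ranked below the quieter one.
stories = [
    {"title": "Quiet technical deep-dive", "votes": 120, "comments": 40},
    {"title": "Political flamebait", "votes": 80, "comments": 400},
]
ranked = sorted(stories,
                key=lambda s: adjusted_score(s["votes"], s["comments"]),
                reverse=True)
```

Note how this inverts the usual "all engagement is good engagement" objective: raw comment volume actively hurts a story's rank once it outpaces votes.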
| ▲ | setsewerd 4 days ago | parent | next [-] | | That's interesting, it seems like it would accidentally penalize a lot of "good" posts too, like people asking questions to better understand a topic/perspective | |
| ▲ | Karrot_Kream 4 days ago | parent | prev [-] | | [dead] |
| |
| ▲ | abdullahkhalids 5 days ago | parent | prev | next [-] | | No. Facebook algorithm produces different outputs for every user. HN's algorithm produces one output for all users. They are qualitatively distinct. Facebooks' algorithm is demonstrably harmful. HN's not so much. | | |
| ▲ | Karrot_Kream 5 days ago | parent [-] | | Do you have proof that demonstrates that FB's algorithm is more harmful than upvotes on HN or Reddit? Not that it's harmful compared to a world before FB, that it's more harmful than an upvote based algorithm. |
| |
| ▲ | jerrycruncher 4 days ago | parent | prev [-] | | This is a really canonical example of a "Yet you participate in a society. Curious!" post. Well done. [0] https://imgur.com/we-should-improve-society-somewhat-T6abwxn | | |
| ▲ | Karrot_Kream 4 days ago | parent [-] | | Thanks. I try really hard. Wait, was that supposed to be a backhanded compliment? No way, can't be, HN is above that kind of behavior (: My point, overall, is that all the criticism of social media that excludes HN is based on vibes. And if we're about to ban social media for the EU then hopefully we have more than vibes to go off of. |
|
|
|
| ▲ | bsder 4 days ago | parent | prev | next [-] |
| Agreed. If anyone in the medical community tried the stuff that Facebook and Google do, it would fail immediately at an ethics review board and/or the person would lose their medical license. |
|
| ▲ | plopilop 5 days ago | parent | prev | next [-] |
Sooo... Should we ban Google too? It also orders its search results with algorithms. Similarly, HN and reddit order the contents of their front pages with algorithms, and in the case of Google and Reddit, the algorithm is personalized with the user's preferences. Or do we only ban websites that design their algorithms to trigger strong emotional reactions? How do you define that? Even Musk doesn't go around saying that the algorithm is modified to promote the alt-right; instead he pretends it is all about "bringing balance back". Furthermore, I would argue that systems based on votes such as Reddit or HN are much more likely than other systems to push such content. We could issue a regulation to ban specific platforms or websites (TikTok, X...) by naming them individually, but that would probably go against many rules of free competition, and would be quite easily circumvented. Not that I disagree on the effect of social media on society, but regulating this is not as easy as "let's ban the algorithm". |
| |
| ▲ | ktosobcy 5 days ago | parent [-] | | Erm, FB itself admitted it ran research on emotional responses to the content it shows. FB/X's modus operandi is to keep as many people glued to the screen for as long as possible. The most triggering content will awaken all those "keyboard warriors" to fight. So instead of seeing your friends and the people you follow on there, you would mostly see something that would affect you one way or another (hence the proliferation of more and more extreme stuff). Google is going downhill, but for different reasons - they also care only about the investors' bottom line, but being the biggest ad provider they don't care all that much whether people spend time on the google.com page or not. | | |
| ▲ | plopilop 5 days ago | parent [-] | | Oh, I know that strong emotions increase engagement, outrage being a prime candidate. I also have no issue believing that FB/TikTok/X etc. aggressively engage in such tactics, e.g. [0]. But I am not aware of FB publicly acknowledging that they deliberately tune the algorithm to this effect, even though they carried out some research on the effects of emotions on engagement (I would love to be proven wrong though). But suppose FB did publicly say they manipulate their users' emotions for engagement, and a law is passed preventing that. How do you assess that the new FB algorithm is not manipulating emotions for engagement? How do you enforce your law? If you are not allowed to create outrage, are you allowed to promote posts that expose politicians' corruption? Where is the limit? Once again, I hate these algorithms. But we cannot regulate by saying "stop being evil"; we need specific metrics, targets, objectives. A law too broad will ban Google as much as Facebook, and a law too narrow can be circumvented in many ways. [0] https://www.wsj.com/tech/facebook-algorithm-change-zuckerber... | | |
| ▲ | mschuster91 4 days ago | parent [-] | | > But we cannot regulate by saying "stop being evil", we need specific metrics, targets, objectives. Ban any kind of provider-defined feed that is not chronological and does not include content of users the user does not follow, with the exception for clearly marked as-such advertising. Easy to write as a law, even easier to verify compliance. | | |
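The proposed rule is simple enough to express directly. A minimal sketch of a compliant feed, assuming a hypothetical post schema (`author_id`, `posted_at`, `is_marked_ad`); any real platform's data model will differ:

```python
def compliant_feed(posts, followed_ids):
    """Build a feed per the proposed rule: only posts from accounts the
    user follows, ordered strictly by recency, with the sole exception
    of items explicitly marked as advertising."""
    visible = [
        p for p in posts
        if p["author_id"] in followed_ids or p.get("is_marked_ad", False)
    ]
    # Chronological: newest first, no engagement-based reordering.
    return sorted(visible, key=lambda p: p["posted_at"], reverse=True)

# Illustrative data: the unfollowed "viral outrage" post is excluded
# entirely; the marked ad is allowed through.
posts = [
    {"author_id": "alice",    "posted_at": 3, "text": "hello"},
    {"author_id": "stranger", "posted_at": 2, "text": "viral outrage"},
    {"author_id": "bob",      "posted_at": 1, "text": "lunch"},
    {"author_id": "adco",     "posted_at": 4, "text": "buy stuff",
     "is_marked_ad": True},
]
feed = compliant_feed(posts, followed_ids={"alice", "bob"})
```

Verifying compliance would then reduce to checking that every displayed item either comes from a followed account or carries the ad marker, and that the ordering is purely by timestamp.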
| ▲ | plopilop 4 days ago | parent [-] | | You have banned Google, Reddit, and HN. | | |
| ▲ | mschuster91 4 days ago | parent [-] | | Google is not social media, Reddit and HN offer ranking based on karma instead of addiction algorithms. | | |
| ▲ | plopilop 3 days ago | parent [-] | | None of these were in your initial law. Furthermore, karma is also addictive. |
|
|
|
|
|
|
|
| ▲ | rasmus-kirk 5 days ago | parent | prev | next [-] |
I like this, but it also leaves the door wide open to censorship. Also, this would include YouTube, which would be a marked deterioration in learning. |
| |
| ▲ | Krssst 4 days ago | parent [-] | | We can have freedom of expression with a regular chronological feed from selected followed users. There's no need for a smart feed that optimises whatever the entity owning the network wants. |
|
|
| ▲ | amelius 4 days ago | parent | prev | next [-] |
| Let's start with banning the monetization model. |
|
| ▲ | nradov 5 days ago | parent | prev | next [-] |
| Fortunately the US federal government is standing up for the interests of US tech companies, and for the principle of free speech. They won't let the EU get away with such an extreme authoritarian move. |
| |
| ▲ | a_ba 5 days ago | parent | next [-] | | This administration is not standing up for the principles of free speech. It has violated this principle numerous times in action and in spirit. | |
| ▲ | pessimizer 5 days ago | parent | prev | next [-] | | > for the principle of free speech This administration is taking a newly-formed censorship regime that was largely operated by the nepo babies of politicians running do-nothing tax-supported nonprofits, but implemented and operated by Mossad agents, and removing the nepo babies from the loop. You can say "retard" now, but if you call somebody who executes Palestinian children a retard, you're going on a government blacklist. edit: This post has been classified and filed, and associated with me for the rest of my life. | |
| ▲ | jajko 5 days ago | parent | prev | next [-] | | Interest of tech companies (or more specifically their stockholders), for sure. Not so much for the long term interests of its citizens though. | |
| ▲ | myvoiceismypass 5 days ago | parent | prev | next [-] | | > for the principle of free speech Indeed. You are free to praise the president or face the consequences. Some freedom. | |
| ▲ | int_19h 4 days ago | parent | prev | next [-] | | I'm very skeptical of EU censorship, but EU citizens can and should figure it out for themselves. There's no reason why we Americans should be telling them how to run their economies, nor do we have some intrinsic right for our companies to operate in any random market. | |
| ▲ | ktosobcy 5 days ago | parent | prev | next [-] | | Can the US eff off and keep this civil and social enshittification to itself? The rest of the world would be very happy if the US would finally put a wall around itself and stopped meddling with every darn scrap of the world... | |
| ▲ | maleldil 5 days ago | parent | prev | next [-] | | > standing up for the interests of US tech companies Imagine if they stood up for the interests of citizens instead. | |
| ▲ | miltonlost 5 days ago | parent | prev [-] | | Lol a content algorithm is not free speech | | |
| ▲ | krapp 5 days ago | parent [-] | | All software is free speech, end of. It's insane that the same community that rails against attempts to police encryption, that believes in the ethos of free software, that "piracy isn't theft" and "you can't make math illegal" and that champions crypto/blockchain to prevent censorship is so sympathetic to banning "content ordering algorithms." The problem is not the algorithms, the problem is the content, and the way people curate that content. Platforms choosing to push harmful content and not police it is a policy issue. Is the content also free speech? Yes. But like most people I don't subscribe to an absolutist definition of free speech nor do I believe free speech means speech without consequences (absent government censorship) or that it compels a platform. So I think it's perfectly legitimate for platforms to ban or moderate content even beyond what's strictly legal, and far less dangerous than having governments use their monopoly on violence to control what sorting algorithms you're allowed to use, or to forcibly nationalize and regulate any platform that has over some arbitrary number of users (which is something else a lot of people seem to want.) We should be very careful about the degree of regulation we want governments to apply to what is in essence the only free mass communications medium in existence. Yes, the narrative is that the internet is entirely centralized and controlled by Google/Facebook/Twitter now but that isn't really true. It would absolutely become true if the government regulated the internet like the FCC regulates over the air broadcasts. Just look at the chaos that age verification laws are creating. Do we really want more of that? | | |
|
|
|
| ▲ | 5 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | tomp 5 days ago | parent | prev | next [-] |
| Why would "algorithmic" outrage-porn content (X, Meta) be any worse than human-ordered outrage-porn content (news websites)? |
| |
| ▲ | ktosobcy 5 days ago | parent [-] | | News websites are regulated… | | |
| ▲ | socalgal2 5 days ago | parent [-] | | They are? As far as I can tell they are no more regulated than anyone else. There is the issue that a news site generally has a limited number of contributors, whereas a social media site has an infinite number of contributors. In either case, it seems like the same laws (defamation, fraud, etc.) apply to the authors of the posts, which might be easier to enforce when it's a news site, as the site itself takes responsibility for the content. | | |
| ▲ | ktosobcy 5 days ago | parent [-] | | Yes, they are (not sure about the US). In general, the mere fact that there is a limited number of contributors who are known, with indicated authorship, goes a long way. Also, all publishers have to register, indicating who is behind a particular "medium". By contrast, with social "media" there is no accountability. Anyone can publish anything and there is basically no information about who published it. You can sue, but then the publishing platform has no information about the author, so the process is long and convoluted. Making social media what it started as (a network of close friends), where you only see the content your friends publish, plus a requirement for actual details about who is behind a particular profile (say, for pages/profiles with more than 10k followers, at which point, let's be honest, it's not a "friend"), would go a long way. |
|
|
|
|
| ▲ | eviks 5 days ago | parent | prev | next [-] |
| Only with all the censors as kindling! |
|
| ▲ | _mlbt 5 days ago | parent | prev | next [-] |
| [flagged] |
| |
| ▲ | sniffers 5 days ago | parent | next [-] | | "We should kill/imprison peoples who have (immutable characteristic)" is hardly just a "mean thing people post". There's mean content ("I think you are an asshole") and there's content that's going to cause actual harm because it either goads others to violence or because it creates a constant cortisol increase from fear and dehumanization. | |
| ▲ | daoboy 5 days ago | parent | prev | next [-] | | I strongly agree that free speech is crucial, but the first part of your statement is in direct opposition to the second. People stating their perspectives and arguing against others' with complete disregard for civility (or being 'mean', as you said) makes it far more difficult for people to respect opposing viewpoints. | | |
| ▲ | spacebanana7 5 days ago | parent [-] | | On the contrary, I think a meaningful part of the population is incapable of digesting ideas without them being coupled to conflict. You don't need to respect opposing viewpoints in order to engage with them. For such people everything must be framed in a good versus evil, us vs them or generally sensationalist manner to sustain any kind of attention. |
| |
| ▲ | alistairSH 5 days ago | parent | prev | next [-] | | The problem, to me anyway, is that FB etc don't serve me the opinions of people I know and want to engage with. Instead, they serve me a stream of content specifically tailored to annoy me and get the dopamine hit and make me react. Of course, my solution was to stop using those services. But, I wouldn't be surprised if certain personality types are unable to do that (same as they can't quit smoking or porn or whatever else). | |
| ▲ | ktosobcy 5 days ago | parent | prev [-] | | How do you learn to "deal and tolerate" people that constantly spit in your face? Especially if you try to avoid them but the platform does everything in its power to steer them your way? It's basically a dark entity that cranks up football hooligans and then pushes them onto a collision course. There is no civility there. |
|
|
| ▲ | richwater 5 days ago | parent | prev [-] |
| [flagged] |
| |
| ▲ | dang 4 days ago | parent | next [-] | | Could you please stop posting flamebait, ideological battle comments, etc? You've unfortunately been doing it repeatedly. It's not what this site is for, and destroys what it is for. If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful. | |
| ▲ | woodpanel 5 days ago | parent | prev [-] | | [flagged] | | |
|