| ▲ | cosmic_cheese 4 days ago |
| The hyper-polarization is probably preventable, in my estimation. The main thing a social network would need to do is stymie the flywheel effect that allows a handful of users (and thus sets of norms) to come to dominate so strongly. That might mean something along the lines of a system that puts a hard cap on the reach any profile or topic can have: when engagement exceeds the triggering threshold, reach actually tapers off in proportion to how far the threshold is exceeded. In theory this would naturally elevate posts that are more measured and mundane while sinking posts with big emotional lizard-brain appeal (by design or otherwise). With time this would establish a self-reinforcing norm that makes polarized and inflammatory posts look as clownish as they actually are. |
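A minimal Python sketch of the taper being described. The function name, the engagement metric, and the flat threshold are all hypothetical illustrations, not any platform's actual mechanism:

```python
# Hypothetical sketch: reach tapers off once engagement passes a cap.
def effective_reach(base_reach: float, engagement: float,
                    threshold: float = 10_000.0) -> float:
    """Allowed reach for a post; shrinks in proportion to how far
    engagement exceeds the threshold (all values are illustrative)."""
    if engagement <= threshold:
        return base_reach
    excess_ratio = (engagement - threshold) / threshold
    # The divisor grows linearly with the excess, so runaway posts see
    # steadily diminishing distribution rather than a hard cutoff.
    return base_reach / (1.0 + excess_ratio)

# Example: a post at 3x the threshold gets a third of its base reach.
print(effective_reach(1000.0, 30_000.0))  # 333.33...
```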
|
| ▲ | RiverCrochet 4 days ago | parent | next [-] |
| I think a lot of social network problems would be solved if platforms put an orange flag next to profiles that have posted more than 10 times in the last 24 hours, and a red flag next to profiles that have posted more than 60 times in the last 7 days. Showing the total number of flags ever given to an account on its bio would be good as well. No other automatic action, just a visible flag or other symbol. Being able to temporarily filter out profiles that post too many times (a setting you could change) would also be nice, but it shouldn't be automatic. |
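A sketch of those frequency flags in Python. The thresholds come straight from the suggestion above; the data model and function name are hypothetical:

```python
from datetime import datetime, timedelta

ORANGE_24H_LIMIT = 10  # more than 10 posts in the last 24 hours
RED_7D_LIMIT = 60      # more than 60 posts in the last 7 days

def frequency_flags(post_times: list[datetime], now: datetime) -> list[str]:
    """Visible flags for a profile, given its post timestamps.
    No other automatic action is taken, per the suggestion above."""
    last_24h = sum(t > now - timedelta(hours=24) for t in post_times)
    last_7d = sum(t > now - timedelta(days=7) for t in post_times)
    flags = []
    if last_24h > ORANGE_24H_LIMIT:
        flags.append("orange")
    if last_7d > RED_7D_LIMIT:
        flags.append("red")
    return flags
```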
| |
| ▲ | skybrian 4 days ago | parent | next [-] | | Bluesky has a “quiet posters” feed that I find useful. | | |
| ▲ | dhosek 4 days ago | parent [-] | | It’s somebody’s side project, I think, not an official Bluesky feature, but yes, that’s become my primary feed for Bluesky. Following is my secondary, and I almost never look at Discover or Popular With Friends. |
| |
| ▲ | cosmic_cheese 4 days ago | parent | prev [-] | | Not a bad idea. It may also be good to distinguish replies and reposts from unique timeline posts, since “reply guys” are consistently some of the most notorious individuals. |
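A possible extension of the flag counter sketched earlier, tallying activity per post kind so reply-heavy profiles can be flagged separately from merely prolific ones; the PostKind enum and its values are hypothetical:

```python
from collections import Counter
from enum import Enum

class PostKind(Enum):
    ORIGINAL = "original"
    REPLY = "reply"
    REPOST = "repost"

def counts_by_kind(recent_posts: list[PostKind]) -> Counter:
    """Tally recent activity per kind, so "reply guys" show up as
    reply-heavy rather than merely prolific."""
    return Counter(recent_posts)

# Example: 2 originals, 5 replies -> a reply-heavy profile.
print(counts_by_kind([PostKind.ORIGINAL] * 2 + [PostKind.REPLY] * 5))
```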
|
|
| ▲ | skybrian 4 days ago | parent | prev | next [-] |
| I think it’s more about not taking posts out of context. Communities need boundaries between them. Substack and other blogging tools are good this way. For Bluesky, the problem is that the replies to someone you follow can be pretty bad. (Official Bluesky posts are an example of this.) People can filter them individually, but it’s not the same as a blog with good moderation. I don’t think I could do a whole lot if the replies to one of my Bluesky posts were bad? |
| |
| ▲ | dhosek 4 days ago | parent [-] | | But blocking on Bluesky works better than it did on Twitter. If you post a crummy reply to me and I block you, nobody sees your reply. There are a few other small differences between Bluesky and Twitter that really do a lot to cut down on the pile-on effect that’s common on Twitter. | | |
| ▲ | skybrian 3 days ago | parent [-] | | That’s good, but there are low-information posts where a hard block on first offense is kind of harsh. | | |
| ▲ | dhosek 3 days ago | parent [-] | | Nah, I have no obligation to interact with anyone I don’t want to. | | |
| ▲ | xdennis 3 days ago | parent [-] | | That's true, but the result is a boring platform where everyone agrees on everything, and even the most minor disagreement will get you blocked. This post, by a user who discovered that he instantly got 30,000 blockers simply by joining and following some starter packs of journalists, is hilarious: https://www.reddit.com/r/BlueskySocial/comments/1mgz19y/why_... | | |
| ▲ | KingMob 3 days ago | parent [-] | | It's important to distinguish between the blocklists and the general blocking functionality of Bluesky. The blocklists, as an experiment, are too easily gamed or abused. (I never use them.) List maintainers have added people they have personal beef with, and bad actors have started deceptive lists that change after enough people follow. But the general block/mute functionality on Bsky is way better than most social media, and goes a long way to avoiding abusive or unpleasant people. |
|
|
|
|
|
|
| ▲ | gamacodre 4 days ago | parent | prev | next [-] |
| A recent study[1] seems to indicate that polarization is a hard problem, along with some of the other negative effects of social media. Many of the commonly suggested solutions have minimal impact, or no effect at all. That flywheel effect is surprisingly robust. [1] https://arxiv.org/abs/2508.03385 |
| |
| ▲ | cosmic_cheese 3 days ago | parent [-] | | I saw that, but the approach taken is questionable (do LLMs represent realistic behavior for scenarios they’ve not been trained for?), and nothing like my suggestions here seems to have been tested. It’s better than nothing, but far from conclusive in my opinion. |
|
|
| ▲ | rectang 4 days ago | parent | prev [-] |
> stymie the flywheel effect that allows a handful of users (and thus sets of norms) to come to dominate so strongly This prevents certain communities from forming and certain topics from being discussed. For example, you can't discuss LGBTQ issues with troll armies constantly swarming and spamming. If such communities are not given tools to exclude malignant disruptors by setting norms and "dominating" a given channel, they will have to go elsewhere (such as leaving X for BlueSky). |
| |
| ▲ | Levitz 3 days ago | parent | next [-] | | The problem is that the communities then do form, but automatically radicalize. Truth Social and Bluesky users are in similar bubbles, just on opposite sides of the spectrum. | | |
| ▲ | rectang 3 days ago | parent [-] | | So what? Should they not exist? Why must marginalized communities leave themselves defenseless and accept that they can only have a conversation among themselves in the midst of a hurricane of abuse? | | |
| ▲ | Karrot_Kream 3 days ago | parent | next [-] | | Personally I think media like Bluesky are not suited to forming these kinds of closed communities. There's nothing wrong with a closed/gated community, and I think it's especially important for marginalized communities as you mention, but I think in 2025 you could do that with a Discord "server", a Discourse forum, or a non-federated Lemmy instance. The problem with folks like this on Bluesky and X is that they want both a closed community and the easy comings and goings that a more open forum offers. IMO it's a fool's errand. There's a reason why the humble group chat has won as the social media of choice for, well, everyone. | |
| ▲ | simianwords 3 days ago | parent | prev [-] | | Because they can get so dominant as to alienate normal users. Like in Reddit. But I agree with your larger point and I think it is a valid point. |
|
| |
| ▲ | cosmic_cheese 4 days ago | parent | prev | next [-] | | This system wouldn’t work in place of moderation, but rather alongside it. The two would have an enhancing effect on each other: - Reach limits greatly blunt troll effectiveness, since trolls can’t find each other as easily - Posts that exceed the threshold naturally vs. being trolled past it would have different “fingerprints” that could be used like a blacklight for troll detection, both to assist moderators and to train models that automatically flag suspected trolls The threshold should probably be dynamic and set at the point at which posts “breach containment” (escape from their intended audience), which is where problems tend to occur. Bluesky-like self-moderation controls would also help. | | |
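A sketch of that dynamic "breach containment" threshold in Python. The breach ratio, the damping factor, and every name here are hypothetical illustrations of the heuristic, not a known mechanism:

```python
# Hypothetical sketch of a taper threshold keyed to "containment
# breach": how much engagement comes from outside the intended audience.

def breach_ratio(engager_ids: set[str], follower_ids: set[str]) -> float:
    """Fraction of engagers who are outside the intended audience."""
    if not engager_ids:
        return 0.0
    return len(engager_ids - follower_ids) / len(engager_ids)

def dynamic_threshold(base_threshold: float, ratio: float) -> float:
    """Lower the taper threshold as more engagement escapes containment,
    so breaching posts get damped sooner. The 0.5 damping factor is an
    arbitrary illustrative choice."""
    return base_threshold * (1.0 - 0.5 * ratio)

# Example: 80% of engagement from outsiders drops the threshold by 40%.
ratio = breach_ratio({"a", "b", "c", "d", "e"}, {"a"})
print(dynamic_threshold(10_000.0, ratio))  # 6000.0
```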
| ▲ | 1234letshaveatw 4 days ago | parent [-] | | Moderation inevitably leads to exclusion: just look at the US state-specific subreddits that are moderated by radicals who prohibit even the slightest deviation from their views, which silences dissent. This one-sided viewpoint is then slurped up and used to train AI models in a kind of gross feedback loop | | |
| ▲ | cosmic_cheese 4 days ago | parent | next [-] | | Reddit’s fatal flaw is that subreddit mods are volunteers. Sometimes this works well, when you get a knowledgeable, benevolent individual in the position, but more often than not you get people who want to power trip. Mods should be in-house, on payroll, and strictly bound to the network’s standards. This should generally be less of an issue anyway in a system that actively penalizes the sorts of crudely expressed, un-nuanced posts that are typically social media’s bread and butter. Not being able to appeal to basal emotions (“it feels right” is a poor metric) and being required to substantiate views more intelligently takes the wind out of a lot of fringe sails. | |
| ▲ | rectang 3 days ago | parent | prev [-] | | > moderation inevitably leads to exclusion Yes, it has to, because trolls and haters are relentless. The choice of whether or not marginalized communities are allowed to ban abusive posters is fundamental because moderation resources are finite. Of course, there are many who believe that marginalized communities should not be allowed to moderate posts and should be willing to absorb a constant onslaught of abuse as the price of existing. | | |
| ▲ | 1234letshaveatw 3 days ago | parent [-] | | One person's "trolls and haters" are another's dissenters. It is disturbing how quickly marginalized communities become echo chambers | | |
| ▲ | cosmic_cheese 3 days ago | parent | next [-] | | Nah, much of the time trolls and haters are just trolls and haters. Dissenters who want to be taken seriously generally aren’t easily mistaken for trolls, because they’re not there to get a rise out of the opposing side. One also needs to keep in mind that spaces for the marginalized in particular are more sensitive than typical, because their members have to keep their shields up at all times given how much more likely attacks are. If making them feel like they can safely drop their shields is a goal, then incidents like people posting in an antagonistic and/or harassing manner need to drop to background-radiation levels. | |
| ▲ | immibis 2 days ago | parent [-] | | Dissenters nonetheless get handled by the same system that handles trolls and haters, at the hands of the people they're dissenting from. If there's a system that site moderators can use to globally block trolls and haters, they also use it to block their dissenters. Every time. If there's a system where you can stop me replying to you, it'll get used by people selling snake oil to block replies saying "hey, this is actually snake oil". Every time. | | |
| ▲ | rectang 2 days ago | parent [-] | | And so the "solution" is to give dissenters free rein and thus trolls and haters free rein as well, effectively deplatforming all but the most pugilistic amongst marginalized communities as those who don't want to spend their lives fighting flee the torrent of abuse. Whether you intend it or not, that's a fabulous scenario for both active haters and those who quietly prefer that the marginalized not exist. But even after they flee, it won't be enough — the new platform where they took refuge, having become popular, must be stormed. Any room where the marginalized congregate must be filled with the din of hatred. In short, it is not enough for Musk-era Twitter to be Twitter — BlueSky must also become Twitter. | | |
| ▲ | immibis 2 days ago | parent [-] | | There might be no good solution. In that case, you can still implement the least bad solution, but it's better to know that it's merely the least bad than to fool yourself into thinking it's good; in particular, knowing that it's not a good solution should make you more open to hearing other ideas. BlueSky is already quite bad in the way old Twitter was, though not in the way X is. It's not filled with Nazis, but it is very bland and corporate, with no substance. |
|
|
| |
| ▲ | 3 days ago | parent | prev [-] | | [deleted] |
|
|
|
| |
| ▲ | packetlost 4 days ago | parent | prev [-] | | As always, there's a balance. Communities (and individuals) generally need the ability to moderate and manage access to both membership and interactions with the community. Algorithm-driven open platforms are sorta incompatible with that idea |
|