rectang 4 days ago

> stymy the flywheel effect that allows a handful of users (and thus sets of norms) to come to dominate so strongly

This prevents certain communities from forming and certain topics from being discussed. For example, you can't discuss LGBTQ issues with troll armies constantly swarming and spamming. If such communities are not given tools to exclude malignant disruptors by setting norms and "dominating" a given channel, they will have to go elsewhere (such as leaving X for BlueSky).

Levitz 3 days ago | parent | next [-]

The problem is that the communities do then form, but automatically radicalize. Truth Social and Bluesky users are in similar bubbles, just on opposite sides of the spectrum.

rectang 3 days ago | parent [-]

So what? Should they not exist? Why must marginalized communities leave themselves defenseless and accept that they can only have a conversation among themselves in the midst of a hurricane of abuse?

Karrot_Kream 3 days ago | parent | next [-]

Personally I think media like Bluesky are not suited to forming these kinds of closed communities. There's nothing wrong with a closed/gated community, and I think it's especially important for marginalized communities as you mention, but I think in 2025 you could do that with a Discord "server", a Discourse forum, or a non-federated Lemmy instance.

The problem with folks like this on Bluesky and X is that they want to both have a closed community and benefit from the easy comings and goings that a more open forum offers. IMO it's a fool's errand. There's a reason why the humble group chat has won as the social media of choice for, well, everyone.

simianwords 3 days ago | parent | prev [-]

Because they can get so dominant as to alienate normal users, like on Reddit.

But I agree with your larger point; I think it's valid.

cosmic_cheese 4 days ago | parent | prev | next [-]

This system wouldn’t work in place of moderation, but rather alongside it. The two would have an enhancing effect on each other:

- Reach limits greatly blunt troll effectiveness, since trolls can't find each other as easily

- Posts that exceed the threshold naturally vs. ones that are trolled past it would have different “fingerprints”, which could be used like a blacklight for troll detection, both to assist moderators and to train models for automatically flagging suspected trolls

The threshold should probably be dynamic and set at the point at which posts “breach containment” (escape from their intended audience), which is where problems tend to occur.

Bluesky-like self-moderation controls would also help.
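
To make the threshold idea concrete, here's a rough sketch of what a "breach containment" check plus fingerprinting might look like. Everything here is hypothetical: the metric names, ratios, and cutoffs are made up for illustration, not drawn from any real platform.

    from dataclasses import dataclass

    @dataclass
    class PostStats:
        views_in_network: int       # views from the author's follow graph
        views_out_of_network: int   # views from outside it
        replies_from_strangers: int
        quote_posts: int

    def containment_breached(stats: PostStats, ratio: float = 0.5) -> bool:
        """A post "breaches containment" when out-of-network views start
        to dominate views from the intended audience."""
        total = stats.views_in_network + stats.views_out_of_network
        return total > 0 and stats.views_out_of_network / total > ratio

    def troll_fingerprint(stats: PostStats) -> float:
        """Crude fingerprint: organic breaches tend to spread via quotes
        and shares, while brigaded ones are reply-heavy from strangers.
        Higher score = more troll-like."""
        return stats.replies_from_strangers / (stats.quote_posts + 1)

    # A post dragged out of its bubble by hostile repliers rather than shares:
    post = PostStats(views_in_network=400, views_out_of_network=2600,
                     replies_from_strangers=900, quote_posts=12)
    if containment_breached(post) and troll_fingerprint(post) > 20.0:
        print("reach-limit the thread and queue it for moderator review")

The point isn't the specific numbers but that the two signals are separable: whether a post escaped its audience, and how it escaped.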

1234letshaveatw 4 days ago | parent [-]

Moderation inevitably leads to exclusion. Just look at the US state-specific subreddits that are moderated by radicals who prohibit even the slightest deviation from their views, silencing dissent. This one-sided viewpoint is then slurped up and used to train AI models in a kind of gross feedback loop.

cosmic_cheese 4 days ago | parent | next [-]

Reddit’s fatal flaw is that subreddit mods are volunteers. Sometimes this works well, when you get a knowledgeable, benevolent individual in the position, but more often than not you get people who want to power trip.

Mods should be in-house, on payroll, and strictly bound to the network’s standards.

This should generally be less of an issue anyway in a system that actively penalizes the sorts of crudely expressed, un-nuanced posts that are typically social media’s bread and butter. Not being able to appeal to basal emotions (“it feels right” is a poor metric) and being required to substantiate views more intelligently takes the wind out of a lot of fringe sails.

rectang 3 days ago | parent | prev [-]

> moderation inevitably leads to exclusion

Yes, it has to, because trolls and haters are relentless. The choice of whether or not marginalized communities are allowed to ban abusive posters is fundamental because moderation resources are finite.

Of course, there are many who believe that marginalized communities should not be allowed to moderate posts and should be willing to absorb a constant onslaught of abuse as the price of existing.

1234letshaveatw 3 days ago | parent [-]

One person's "trolls and haters" are another's dissenters. It is disturbing how quickly marginalized communities become echo chambers.

cosmic_cheese 3 days ago | parent | next [-]

Nah, much of the time trolls and haters are just trolls and haters. Dissenters who want to be taken seriously generally aren’t easily mistaken for trolls, because they’re not there to get a rise out of the opposing side.

One also needs to keep in mind that spaces for the marginalized in particular are more sensitive than typical, because their members have to keep their shields up at all times given how much more likely attacks are. If making them feel like they can safely drop their shields is a goal, then incidents like people posting in an antagonistic and/or harassing manner need to drop to background-radiation levels.

immibis 2 days ago | parent [-]

Dissenters nonetheless get put through the same system as trolls and haters, by the people they're dissenting from. If there's a system that site moderators can use to globally block trolls and haters, they'll also use it to block their dissenters. Every time. If there's a system where you can stop me replying to you, it'll get used by people selling snake oil to block anyone from replying "hey, this is actually snake oil". Every time.

rectang 2 days ago | parent [-]

And so the "solution" is to give dissenters free rein and thus trolls and haters free rein as well, effectively deplatforming all but the most pugilistic amongst marginalized communities as those who don't want to spend their lives fighting flee the torrent of abuse. Whether you intend it or not, that's a fabulous scenario for both active haters and those who quietly prefer that the marginalized not exist.

But even after they flee, it won't be enough — the new platform where they took refuge, having become popular, must be stormed. Any room where the marginalized congregate must be filled with the din of hatred. In short, it is not enough for Musk-era Twitter to be Twitter — BlueSky must also become Twitter.

immibis 2 days ago | parent [-]

There might be no good solution. In that case, you can still implement the least bad solution, but it's better to know that it's merely the least bad than to fool yourself into thinking it's good. In particular, knowing that it's not a good solution should make you more open to hearing other ideas.

Bluesky is already quite bad in the way old Twitter was, though not in the way X is. It's not filled with Nazis, but it is very bland and corporate, with no substance.

packetlost 4 days ago | parent | prev [-]

As always, there's a balance. Communities (and individuals) generally need the ability to moderate and manage access to both membership and interactions with the community. Algorithm-driven open platforms are sorta incompatible with that idea.