hbarka 2 days ago
Full anonymity in social media should not be allowed. It becomes a cover for bad actors (propagandists, agents, disinformation campaigns, bots, age-inappropriate content, etc.). It doesn't have to be a full identity, but knowing that your user metadata is visible during interactions can instill a sense of responsibility and of consequence for social actions. As in real life.
creata 2 days ago
People should be able to say things without those things following them around for the rest of their lives.

> As in real life.

No, your proposal is very different to real life. In real life, the things you say will eventually be forgotten. You won't be fired for things you said or did years ago, because people will have moved on. Having a convenient index of everything anyone has ever shared is very different to real life.
makeitdouble 2 days ago
Real life needs full anonymity too. Not everywhere, but it's critical to have some. For instance, a political vote needs to be anonymous. Access to public space typically is anonymous too (you're not required to identify yourself to walk down the street), even if that anonymity can sometimes be lifted. Real life is complex, and for good reasons; if we want to take it as a model, we should integrate its full complexity as well.
idle_zealot 2 days ago
Looking at any random full-real-name Facebook account will disabuse you of this notion. People will tie vile shit to their identities without a second thought.

Rather than sacrifice the cover that anonymity grants vulnerable people, journalists, and activists, I think we should come at this issue by placing restrictions on how social media platforms direct people to information. The impulse to restrict and censor individuals, rather than the powerful organizations profiting from algorithmic promotion of the content you deem harmful, is deeply troubling.

The first step here is simple: identify social media platforms over some size threshold, and require that any content promotion or algorithmic feed mechanism they use is dead simple to understand and doesn't target individuals. That avoids the radicalization rabbit-hole problem. Make the system trivial and auditable. If they fail the audit, they're not allowed to have any recommendation system for a year: just follows and a linear feed (sorting and filtering are allowed so long as they're exposed to the user).

To reiterate: none of this applies if you're below some user cutoff.

Q: Will this kill innovation in social media?
A: What fucking innovation?
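To make "trivial and auditable" a bit more concrete, here is a minimal sketch in Python, with names of my own choosing (Post, linear_feed, muted_words are illustrative, not anything the commenter specified): a feed ranked only by inputs the user explicitly chose, so every item's placement can be explained from settings the user can see.

```python
# Hypothetical sketch of "just follows and a linear feed" -- every ranking
# input is something the user explicitly chose, so the whole mechanism can
# be audited by reading a few lines of code.
from dataclasses import dataclass
from typing import Iterable, List, Set


@dataclass
class Post:
    author: str
    created_at: float  # Unix timestamp
    text: str


def linear_feed(posts: Iterable[Post],
                follows: Set[str],
                muted_words: Set[str] = frozenset()) -> List[Post]:
    """Newest-first feed built only from user-visible inputs:
    the accounts they follow and the words they chose to mute."""
    visible = [
        p for p in posts
        if p.author in follows
        and not any(w.lower() in p.text.lower() for w in muted_words)
    ]
    return sorted(visible, key=lambda p: p.created_at, reverse=True)
```

The point of the sketch is the design choice: nothing in the ranking depends on engagement prediction or per-user profiling, only on follows, explicit mutes, and timestamps.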
| ||||||||||||||||||||||||||||||||
krapp 2 days ago
Kiwifarms is an obvious object lesson in why anonymity online is necessary, and hardly the only one.