Karrot_Kream 4 days ago
I'm very sympathetic to this as well, but I'm curious whether you know of any leads on research investigating this area, as I hesitate to draw a conclusion from a feeling. I participate in a lot of hobbies that have autistic folks in them, and I watched the same anger spread into those communities, along with the predictable good-vs-evil rhetoric that autistic folks tend to fall into.
ants_everywhere 4 days ago | parent
Specifically about autism, I don't. There is an academic literature on trolling and social media, which you can find on Google Scholar, or by asking ChatGPT or Gemini for starting points. The papers I've read haven't been outstanding, but they're better than nothing.

I thought about building tools to track it on Reddit, but with the API changes most of the existing tools have been shut down. There also used to be sites that tracked foreign influence activity, but they've mostly stopped from what I can tell. I did use some of those tools to track inorganic activity in other forums (not autistic spaces at the time) and got a feel for what inorganic activity looks like. So when I later saw the changes in autistic spaces, I recognized the patterns I had already seen elsewhere.

On Reddit at least, what usually happens is that trolls try to become moderators. Failing that, they complain about the moderators and fork the subreddit to a new sub they can moderate. Typically they'll show up as unproblematic power users for a few months before it becomes clear they're trolls. Once they have moderation powers, it's basically over.

At any rate, with LLMs it's impossible to track now. Your best bet, if you're interested, is to study how this works in known cases and then use your own judgment to decide whether what you're seeing matches that pattern.