ants_everywhere 4 days ago
Specifically about autism, I don't. There is an academic literature on trolling and social media, which you can find on Google Scholar or by asking ChatGPT or Gemini for starting points. The papers I've read haven't been outstanding, but they're better than nothing.

I thought about building tools to track it on Reddit, but with the API changes most of the existing tools have been shut down. There also used to be sites that tracked foreign influence activity, but they've mostly stopped from what I can tell. I did use some of those tools to track inorganic activity in other forums (not autistic spaces at the time) and got a feel for what inorganic activity looked like. Then, when I saw the changes in autistic spaces, I recognized the patterns I had already seen elsewhere.

On Reddit at least, what usually happens is trolls try to become moderators. Or, failing that, they complain about moderators and fork the subreddit to a new sub they can moderate. Typically they'll show up as unproblematic power users for a few months before it becomes clear they're trolls. Once they have moderation powers it's basically over.

At any rate, with LLMs it's impossible to track now. Your best bet if you're interested is to study how it works in known cases and then use your own judgment to decide whether what you're seeing matches that pattern.
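As a rough illustration of how one might look for that "new power user quickly becomes a mod" pattern, here is a minimal sketch using PRAW, assuming you have Reddit API credentials; the subreddit name and credentials are placeholders, and Reddit's API only exposes recent history, so this is a lower bound rather than ground truth, not the tooling described above.

```python
# Minimal sketch: for each moderator of a subreddit, compare account age
# with the earliest visible comment in that subreddit. Placeholder names
# and credentials throughout; illustrative only.
import datetime as dt

import praw

SUBREDDIT = "example"  # hypothetical target subreddit

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="mod-pattern-sketch/0.1",
)

sub = reddit.subreddit(SUBREDDIT)
for mod in sub.moderator():
    redditor = reddit.redditor(mod.name)
    created = dt.datetime.fromtimestamp(redditor.created_utc, dt.timezone.utc)

    # Earliest visible comment in this subreddit. The API only returns
    # recent history, so treat this as an approximation.
    first_seen = None
    for comment in redditor.comments.new(limit=1000):
        if comment.subreddit.display_name.lower() == SUBREDDIT.lower():
            ts = dt.datetime.fromtimestamp(comment.created_utc, dt.timezone.utc)
            if first_seen is None or ts < first_seen:
                first_seen = ts

    if first_seen:
        gap_days = (first_seen - created).days
        print(f"{mod.name}: account created {created:%Y-%m-%d}, "
              f"first visible comment in r/{SUBREDDIT} on {first_seen:%Y-%m-%d} "
              f"({gap_days} days after account creation)")
    else:
        print(f"{mod.name}: account created {created:%Y-%m-%d}, "
              f"no visible comments in r/{SUBREDDIT} in recent history")
```

The output is only a starting point: a young account that quickly became a moderator isn't proof of anything, but it's the kind of signal you'd then check against the known cases mentioned above.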
Karrot_Kream 4 days ago
You should totally write up what you were able to find. It's always helpful to understand how these kinds of influence campaigns start. At the very least, researchers can build on older insights even though places like Reddit are now closed off.