ants_everywhere 5 days ago

Unlikely.

A more likely explanation is that pro-violence propaganda began swamping social media in 2016, nine years ago. Today's 18-year-olds have been exposed to it nonstop since they were 9, and 34-year-olds since they were 25.

People who are disposed to anger and violence move along the radicalization sales funnel relatively slowly. But once you've shown interest, you start seeing increasingly angry content, and only angry content. There is a lot of rhetoric specifically telling people they should be angry, should not try to improve things, should resort to violence, and should actively push others to promote violence.

Being surrounded socially by that day in and day out is a challenge for anyone, and if you're predisposed to anger it can become intoxicating.

A lot of people want to say marketing doesn't work or that filter bubbles don't matter. But the bare facts are that we've had nearly a decade of multiple military intelligence agencies running nonstop campaigns promoting violent ideology in the US. And it would be naive to think that didn't make a difference.

The same sort of campaigns were run at a smaller scale during the Cold War and have been successful in provoking hot wars.

voidhorse 5 days ago | parent | next [-]

I think you're right. Couple it with the increasing isolation driven by everyone being online 24/7 in lieu of interacting in person, and you have a recipe for disaster. Even though it's possible to be social on the internet, it has a strong distancing effect, and a lot of groups forge their bonds through hatred, criticism, or dehumanization of others (who cares about the "normies"?). In addition, in many cases you don't even need to interact with people to meet most needs (Amazon, etc.), which further contributes to isolation and the illusion that you don't need others. It's the perfect storm for lowering the barrier to violence: it's easy when you have no connection to the victims and you see them as less than human, as objects, as "NPCs".

ants_everywhere 5 days ago | parent [-]

Your mention of "normies" and "NPCs" reminds me of an unfortunate change I saw happen in autistic communities a few years ago.

Those spaces used to be great places for people to ask questions, share interests, and find relief in a community that understood them. But over just a year or two, the whole atmosphere flipped. The focus turned from mutual support to a shared antagonism toward neurotypical people, who were often dehumanized.

It was heartbreaking to watch. Long-time members, people who were just grateful to finally have a place to belong, were suddenly told they weren't welcome anymore if they weren't angry enough. That anger became a tool to police the community, and many of the original, supportive spaces were lost.

collingreen 5 days ago | parent [-]

I am not in these spaces so it's nice to get your summary. I agree that is tragic.

I've wondered whether this kind of shift is an inevitable response to the growing online trope of autism as the boogeyman used to shill everything from not vaccinating to making your kids drink urine.

The head of US health regularly talks about autistic people as a terrible tragedy inflicted on their parents and a net negative to society. I expect that kind of rhetoric would fuel hostility in any group.

ants_everywhere 5 days ago | parent [-]

I don't know about that, but the shift predates the current head of US health becoming a major public figure.

At the time I did some data analysis on the usernames of people promoting these ideas. Before the Reddit API changes you could get statistics on subs that had overlapping users. What I noticed was an overlap with fringe political subs: the autistic subs with more anger issues had more fringe political people in them, and as the subs became angrier the overlap increased. Invariably, the most vocal and pushy angry people were active in those political subs. You can see similar things with the angrier comments on HN.
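
In case it's useful, here's a rough sketch of what that overlap measurement looks like in Python with PRAW. This is not the tooling I used at the time; it treats recent commenters as a proxy for a sub's active users, and the credentials and subreddit names are placeholders.

    import praw

    def recent_commenters(reddit, sub_name, limit=1000):
        """Collect usernames of recent commenters in a subreddit."""
        authors = set()
        for comment in reddit.subreddit(sub_name).comments(limit=limit):
            if comment.author is not None:  # skip deleted accounts
                authors.add(comment.author.name)
        return authors

    def jaccard(a, b):
        """Jaccard similarity between two sets of usernames."""
        return len(a & b) / len(a | b) if (a or b) else 0.0

    # Placeholder credentials and subreddit names.
    reddit = praw.Reddit(client_id="...", client_secret="...",
                         user_agent="overlap-sketch")
    support_users = recent_commenters(reddit, "some_support_sub")
    fringe_users = recent_commenters(reddit, "some_fringe_political_sub")
    print(f"commenter overlap (Jaccard): {jaccard(support_users, fringe_users):.3f}")

Tracking that number over time is enough to see the trend I described.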

I don't think it's an inevitable response to the things you mention, but it may be related. For example, there's the term "weaponized autism" [0]: politically fringe and extremist groups regularly talk and joke about weaponizing autistic people as trolls. I think the autism forums became part of the recruiting funnel for this sort of extremism. At least, that's the hypothesis that seemed to best explain all the factors.

[0] https://pubmed.ncbi.nlm.nih.gov/35947316/ (I don't know whether this paper or journal is any good; it's just the top hit that seemed relevant. One of the authors is Simon Baron-Cohen, a well-known autism researcher.)

Karrot_Kream 4 days ago | parent [-]

I'm very sympathetic to this as well, but I'm curious whether you know of any research investigating this area, since I hesitate to draw a conclusion from a feeling. I participate in a lot of hobbies that have autistic folks in them, and I watched the same anger spread into those communities, along with the predictable good-vs-evil rhetoric that autistic folks tend to fall into.

ants_everywhere 4 days ago | parent [-]

Specifically about autism, I don't. There is an academic literature on trolling and social media, which you can find via Google Scholar or by asking ChatGPT or Gemini for starting points. The papers I've read haven't been outstanding, but they're better than nothing.

I thought about building tools to track it on Reddit, but with the API changes most of the existing tools have been shut down.

There also used to be sites that tracked foreign influence activity, but from what I can tell they've mostly stopped.

I did use some of those tools to track inorganic activity in other forums (not autistic spaces at the time) and got a feel for what inorganic activity looks like. Then, when I saw the changes in autistic spaces, I recognized the patterns I had already seen elsewhere.

On Reddit at least, what usually happens is trolls try to become moderators. Or, failing that, they complain about moderators and fork the subreddit to a new sub they can moderate. Typically they'll show up as unproblematic power users for a few months before it becomes clear they're trolls. Once they have moderation powers it's basically over.

At any rate, with LLMs it's essentially impossible to track now. Your best bet, if you're interested, is to study how it works in known cases and then use your own judgment to decide whether what you're seeing matches that pattern.

Karrot_Kream 4 days ago | parent [-]

You should totally write up what you were able to find. It's always helpful to understand how these kinds of influence campaigns start.

At the very least, researchers can build models from older insights, even though places like Reddit are now closed off.

ants_everywhere 4 days ago | parent [-]

Thanks for the suggestion. I'm planning to at some point, or possibly to make a video about it.

mothballed 5 days ago | parent | prev [-]

>A lot of people want to say marketing doesn't work or that filter bubbles don't matter. But the bare facts are that we've had nearly a decade of multiple military intelligence agencies running nonstop campaigns promoting violent ideology in the US. And it would be naive to think that didn't make a difference.

Hmm, interesting thesis. I'm aware that something like half of the Whitmer kidnapping plotters were feds or informants, to the point that a few defendants were acquitted at trial. There's certainly some evidence the government intentionally provokes violent actors.

ethbr1 5 days ago | parent | next [-]

I believe parent was referring to the US government and other national governments.

It's on record that Russian and Chinese propaganda campaigns in the US were aimed at sowing division generally, more than at promoting any particular viewpoint.

ants_everywhere 5 days ago | parent [-]

Yes, that's correct. In particular, not just run-of-the-mill division, but impersonating both right-wing and left-wing militants calling for violence.

For example, here's just one that turned up at the top of a quick Google search:

> And the analysis shows that everyone from the former president, Dmitry Medvedev, as well as military bloggers, lifestyle influencers and bots, as you mentioned, are all pushing this narrative that the U.S. is on the brink of civil war and thus Texas should secede from the United States, and that Russia will be there to support this.

https://www.kut.org/texasstandard/2024-02-14/russian-propaga...

throwaway48476 5 days ago | parent | prev [-]

Government employees are just trying to get promoted, so they entrap crazy people they can then stop.