jcranmer | 5 days ago |
Rationalism is essentially a tech-flavored self-help movement, and the people who gravitate towards self-help in general tend to be emotionally vulnerable and strongly susceptible to cult techniques (there's a reason so many cults start out as self-help movements). On top of that, given its tech-flavored nature, Rationalism's adherents seem to gravitate towards strongly utilitarian ethics (evil can be justified if done for a greater good) and an almost messianic attitude towards artificial superintelligence (a good so great it can justify a lot of evil). Finally, it seems to me that Rationalism is especially prone to producing tedious writers who create insularity (by making the writing impenetrable to non-insiders) and lots of schisms over minor disputes that, due to that insularity, end up festering into something rather more cult-like that demands more immediate and drastic action... like the Zizians. | ||
kelseyfrog | 5 days ago | parent |
To add a little nuance, and a bit of a detour from the original topic, some Rationalists (I'm thinking of Scott Alexander) tend to spend a lot of brainpower on the negative aspects of AI too - think the alignment problem. The category of events with near-infinite positive or negative outcomes and zero to few precedents, where it's difficult to establish a base rate (prior), appears to attract them the most. Conversely, an imagined demonic relationship with a yet-to-be-realized unaligned AI results in a particular existential paranoia that permeates other enclaves of Rationalist discourse. |