danaris 8 days ago
> I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

Yeah, this is a pattern I've seen a lot recently, especially in discussions about LLMs and the supposed inevitability of AGI (and the Singularity). This is a good description of it.
kergonath 8 days ago
Another annoying one is the simulation theory crowd. They know just enough physics to build sophisticated mental constructs, without understanding how flimsy the foundations are or that their logical steps are actually unproven hypotheses.
spopejoy 7 days ago
You might have just explained the phenomenon of AI doomsayers overlapping with EA/rationalist types, which I otherwise found inexplicable. EA/Rs seem kind of appallingly positivist otherwise.