jplusequalt 9 hours ago

I don't understand the psychology of doomerism. Are people truly so scared of these futures they are incapable of imagining an alternate path where anything less than total human extinction occurs?

Like if you're truly afraid of this, what are you doing here on HN? Go organize and try to do something about this.

542354234235 7 hours ago | parent [-]

I don’t see it as doomerism, just realism. Looking at the realities of nuclear war shows that it is a world-ending holocaust that could happen by accident or by the launch of a single nuclear ICBM by North Korea, and there is almost no chance of de-escalation once a missile is in the air. There is nothing to be done, other than advocate for nuclear arms treaties in my own country, but that has no effect on Russia, China, North Korea, Pakistan, India, or Iran. Bertrand Russell said, "You may reasonably expect a man to walk a tightrope safely for ten minutes; it would be unreasonable to do so without accident for two hundred years." We will either walk the tightrope for another 100 years or so until global society progresses to where there is nuclear disarmament, or we won’t.

It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone without a powerful Gen AI, or we won’t.

I hope for a future of abundance for all, brought to us by technology. But I understand that some existential threats only need to turn the wrong way once, and there will be no second chance ever.

jplusequalt 7 hours ago | parent [-]

I think it's a fallacy to equate pessimistic outcomes with "realism."

>It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won’t. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone without a powerful Gen AI, or we won’t

Okay, you've laid out two paths here. What are *you* doing to influence the course we take? That's my point. Enumerating all the possible ways humanity faces extinction is nothing more than doomerism if you aren't taking any meaningful steps to lessen the likelihood that any of them occurs.