jplusequalt 7 hours ago
I think it's a fallacy to equate pessimistic outcomes with "realism".

> It is the same with Gen AI. We will either find a way to control an entity that rapidly becomes orders of magnitude more intelligent than us, or we won't. We will either find a way to prevent the rich and powerful from controlling a Gen AI that can build and operate anything they need, including an army to protect them from everyone without a powerful Gen AI, or we won't.

Okay, you've laid out two paths here. What are *you* doing to influence the course we take? That's my point. Enumerating all the possible ways humanity faces extinction is nothing more than doomerism if you aren't taking any meaningful steps to reduce the likelihood that any of them occurs.