TimorousBestie 11 hours ago
> . . . but also what’s called long-termism, which is worrying about the future of the planet and existential risks like pandemics, nuclear war, AI, or being hit by comets. When it made that shift, it began to attract a lot of Silicon Valley types, who may not have been so dedicated to the development part of the effective altruism program.

The rationalists thought they understood time discounting and thought they could correct for it. They were wrong. Then the internal contradictions of long-termism allowed EA to get suckered by the Silicon Valley crew. Alas.