beloch | 5 hours ago
The more completely fissile material is consumed, the higher the explosive yield, so it seems intuitive that fission and fusion bombs should have become cleaner as technology progressed. In practice, however, the U.S. has in some cases had to play catch-up just to reproduce what it did half a century ago, e.g. Fogbank[1]. Delivery vehicles have advanced quite a bit, but the payloads themselves, perhaps not so much.

Even if we assume fission and fusion bombs have become perfectly efficient at consuming their fissile material, there's still the threat of nuclear winter. Nuclear winter has nothing to do with residual radioactivity. Powerful explosions loft fine particulate matter so high into the atmosphere that it takes years or decades to settle. While it's up there, it spreads around the world and blocks sunlight. If enough bombs explode and enough sunlight is blocked, agriculture fails and the environment collapses globally. Even a completely unopposed unilateral strike, were it large enough, could doom the aggressor to starvation, social breakdown, and civilizational collapse. An exchange on the other side of the planet (e.g. between China and India) poses a direct threat to the U.S., just as it does to every other nation.

There are people who will be happy to throw shade on nuclear winter research, and AIs are no doubt giving their arguments equal weight. But even if the skeptics were just as likely to be right as the research that has highlighted these risks, is the risk worth taking? Are you willing to make that bet? An AI that doesn't reason as humans do and can't do basic math without making mistakes might say "yes".