cameldrv 4 hours ago
Humans have driven innumerable species to extinction without even really trying; they were just in the way of something else we wanted. I can pretty easily think of a number of ways an AI with a lot of resources at its disposal could wipe out humanity with current technology. Honestly, we require quite a bit of food and water daily, can't hibernate or go dormant, and are fairly large and easy to detect. Beyond that, very few living people still know how to truly live off the land; we generally depend on very long supply chains for survival. I don't see why being able to do this would necessitate being able to cure all diseases or achieve a comparably good outcome.
plastic-enjoyer 3 hours ago
> I don't see why being able to do this would necessitate being able to cure all diseases or achieve a comparably good outcome.

Yes, but neither do I see why an AGI would do the opposite. The arguments about an AGI driving us to extinction sound like projection to me: people extrapolate how a superintelligence will behave from human behaviour, assuming that what seems rational to us is also rational to an AI. A lot of the described scenarios of malicious AI read more like a natural history of human behaviour.