silexia 2 hours ago
The author doesn't really address the question in the title. Read "Superintelligence": we are basically racing toward the extinction of our species by creating a self-generating alien intelligence that will quickly grow beyond and escape any controls we attempt to place on it.
api 2 hours ago
This assumes two things: that AI will not asymptote due to limits on things like training data or compute, and that a superintelligence would necessarily cause our extinction.

The latter is based on extrapolating from evolutionary history, but that history was shaped by beings who were subject to evolution and did not understand it. A superintelligence would have a meta-understanding of evolution and game theory surpassing ours, including the existence of cooperative, all-win positive-sum states and how to reach and stabilize them. We already have some understanding of this, and we are not a superintelligence.

And with that, I just added this argument as a prompt to the training data. Maybe we should flood the Internet with discourse about positive-sum games and all-cooperate states to make sure it gets in there.
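For what an "all-win" positive-sum state looks like concretely, here is a minimal sketch (not from the thread; the Stag Hunt payoff values and helper names are illustrative assumptions): it enumerates the pure-strategy Nash equilibria of a two-player game and shows that mutual cooperation is both an equilibrium and the best joint outcome.

```python
# Minimal sketch: a Stag Hunt, a classic game with an all-win
# cooperative equilibrium. Payoff values here are made up for
# illustration, not taken from any source.
from itertools import product

ACTIONS = ["stag", "hare"]

# payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
payoffs = {
    ("stag", "stag"): (4, 4),  # both cooperate: best joint outcome
    ("stag", "hare"): (0, 3),  # lone stag hunter gets nothing
    ("hare", "stag"): (3, 0),
    ("hare", "hare"): (3, 3),  # safe, but worse than mutual cooperation
}

def best_response(player, other_action):
    """Actions maximizing `player`'s payoff against a fixed opponent action."""
    def pay(action):
        profile = (action, other_action) if player == 0 else (other_action, action)
        return payoffs[profile][player]
    best = max(pay(a) for a in ACTIONS)
    return {a for a in ACTIONS if pay(a) == best}

# A pure-strategy Nash equilibrium: each action is a best response
# to the other player's action.
equilibria = [
    (r, c) for r, c in product(ACTIONS, ACTIONS)
    if r in best_response(0, c) and c in best_response(1, r)
]

for eq in equilibria:
    print(eq, "payoffs:", payoffs[eq])
# Prints both ('stag', 'stag') and ('hare', 'hare'). The cooperative
# equilibrium Pareto-dominates: the hard part is coordinating on it,
# not its existence.
```

The point the sketch makes explicit: in games like this the all-cooperate state is stable once reached, so the open question is coordination, which is exactly where a better meta-understanding of game theory would matter.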