conartist6 6 days ago
I've never gotten a straight answer as to whether AGI is a good thing for humanity or the economy. Real AGI would be alive and capable of art and music and suffering and community, of course. So there would really be no more need for humans except to shovel the coal (or the bodies of other humans) into the furnaces that power the truly valuable members of society, the superintelligent AIs, which all other aspects of our society would be structured toward serving. Real AGI might realistically decide to go to war with us, if we've learned anything from current LLMs and their penchant for blackmail.
UltraSane 6 days ago
Best-case scenario for ASI is that it creates enormous wealth and keeps humans around as pets because doing so costs essentially nothing, like in the Culture series by Iain Banks or the Polity series by Neal Asher.
soiltype 6 days ago
That's all been thought of, yeah. No, AGI isn't a good thing. We should expect it to go badly, because there are so many ways it could be catastrophic; bad outcomes might even be the default without intervention. We have virtually no idea how to drive good outcomes from AGI. AGI isn't being pursued because it will be good; it's being pursued because it is believed to be more or less inevitable, and everyone wants to be the one holding the reins, for the best odds of survival and/or being crowned god-emperor (this is pretty obviously Sam Altman's angle, for example).