geremiiah | 3 hours ago
Besides the argument above, that an AGI powerful enough to replace 99.999% of humanity won't be controllable, there's also the economic argument: corporations and executives mean nothing if 99.999% of humanity is unemployed. Our economy runs on consumer spending, which would obviously collapse in that scenario. The economic system would be so upended that notions like ownership would become meaningless.
acdha | an hour ago
I would worry that it won’t jump quickly to 99.999%, but will instead grind down different groups of people slowly enough that the elite can entrench their power: being a cop becomes a growth job, people are assigned state-sanctioned, automation-resistant work like picking crops as a condition of receiving social benefits, Republicans more seriously dust off the previously fringe proposals to restrict voting to property owners, and so on. Setting people against each other is a time-honored way for a small elite to control a large population.
caconym_ | an hour ago
If we meet in the post-apocalyptic wasteland, and I have an android slave with a gun while you have nothing but a rusty spoon, it's going to be pretty clear who the android belongs to and who it serves. The android also makes it likely that I'll have plenty of other nice stuff that you don't: food and water, for instance. This scenario is not meant to be taken literally.