jononor | 3 days ago
One of the ways it can work is that people may reject the most extreme case (say, the end of humans being in control) but accept some milder version (most jobs will be done by AI, or most jobs will be AI-assisted). The fear of the extreme can cause people to rationalize the "milder" outcome - independently of whether there are good arguments for that outcome, or even whether the outcome is desirable or better than the status quo.

The investor class is not dependent on wages, so their livelihoods are not at stake. Same with big corporate partners: they hope to improve competitiveness by having fewer employees, with the CEO taking a bonus for it. Regular users who fear for their jobs may act on that fear, hoping they can reskill and transition by being AI-savvy.

I do not agree with the argument they make, but I understand what they are playing at, and unfortunately it can be effective.