zifpanachr23 | 4 days ago
This is probably related to cynicism I've developed over the years (but also maybe wisdom from a life that at times was not easy), but I can't help getting seriously dystopian vibes from your post. More than from the usual boosters, because it seems a little more honest and a little less cognizant of what I think the real moral hangups people tend to have with AI actually are. So you seem genuine in a way many others aren't.

I don't for one second suspect it will play out this way (not just for the usual technical criticisms, though I'm skeptical there as well, but more that I don't think it would be socially sustainable for an extended period of time)... but let's for a moment take your last paragraph at face value and in good faith. What exactly is it that you are advocating for, or accepting? Even if we get some kind of very generous UBI, there's something about human nature that makes me suspect the consequences would be an almost guaranteed miserable existence for pretty much everyone. Even in the best-case scenario, where the results of this transformation are kept under control and distributed in a reasonable manner and the whole thing doesn't cause a social and political meltdown... what is everybody going to do? There's some wisdom in the old saying that "idle hands are the devil's plaything".

That's the real issue I'm most concerned about, and the one that seems to be the least often addressed by either big AI boosters or detractors (I realize both camps often have ulterior motives). I suspect many people feel some version of that concern... so why is this (I would argue most fundamental) question about the impact of AI never talked about?

I don't want to hear about some big Terminator-style fight against AI, or about how wonderful and unpredictable the inevitable future of WALL-E-style luxury gay space communism is going to be... none of those discussion points gets to the heart of what makes many people so uncomfortable with the concept. And the fact that people believe some version of that second scenario is at all socially plausible is what gives me the most pause. It makes the Terminator scenario almost seem like the preferred outcome if we were given a binary choice... in reality I think most would prefer neither, and would agree with me that we aren't even discussing the right issues w.r.t. an "AI gets much better" potential future.
mg | 4 days ago | parent
I'm not advocating, just trying to look into the future. Superhuman AI seems to be a building block of it. And deep transformation of how we work will come with it. So I raised the question of the timeline. That's all.