visarga 8 hours ago

I think you jump from AGI to "human not needed" too abruptly. First of all, AGI might be smarter than you, but you still have to live with the consequences of using it, so you can't be removed from the loop. We need accountability, and AI can't provide it: it has no skin in the game. We need to act well in a local context, and AI sits in a datacenter, not on the ground with us. Humans have to bring that context into the AI.

Hinton predicted 9 years ago that radiologists would lose their jobs, yet today they are better paid and more numerous. Maybe AGI will make humans more valuable instead of less. There might be complementarities and mutual reinforcement between us and AGI.

cogman10 8 hours ago | parent [-]

> I think you jump from AGI to "human not needed" too abruptly.

No, I'm really just looking at what this paper proposes will be the future of labor and expanding on it. I'm not saying AGI will mean "humans not needed"; I'm saying AGI will mean "fewer humans are needed," and in some cases that could be significantly fewer. If you've listened to any CEO gush over AI, you know that's exactly what they want to do.

> Hinton predicted 9 years ago that radiologists would lose their jobs, yet today they are better paid and more numerous. Maybe AGI will make humans more valuable instead of less. There might be complementarities and mutual reinforcement between us and AGI.

Medicine is a tricky place for AI to integrate, yet it is already integrating there. In particular, basically every health insurance company is moving toward using AI to auto-deny claims. That is a concrete case where companies are happy to live with the consequences, even though those consequences are pretty dire for the people they impact.

And, not for nothing, 10 years is a pretty short time in which to completely eliminate an industry. The more we pay radiologists, the more likely you'll start seeing a hospital decide that "maybe we should just start moving low-risk scans to AI." Or you might start seeing cheap remote radiologists willing to take on the risk of getting things wrong.