nuancebydefault 11 hours ago
The article discusses basically two new problems with using agentic AI:

- When one of the agents does something wrong, a human operator needs to intervene quickly and provide the agent with expert instructions. But since experts no longer execute the bare tasks themselves, they quickly forget parts of their expertise. This means the experts need constant training, leaving them little time to oversee the agent's work.

- Experts must become managers of agentic systems, a role they are not familiar with, so they don't feel at home in their job. This problem is harder for people managers (the experts' managers) to recognize, since they rarely experience it first hand.

Indeed the irony is that AI's efficiency gains, as they become more widely adopted, become more problematic, because they crowd out the necessary human in the loop. I think this all means that automation is not taking away everyone's job: it makes things more complicated, so humans can still compete.
grvdrm 9 hours ago
Your first problem doesn’t feel new at all. It reminded me of a situation several years ago: what was previously an Excel report was automated into Power BI. Great, right? Time saved, etc. But the report was very wrong for months, maybe longer. And since it was automated, the instinct to check and validate was gone. Tracking down the problem required extra work that hadn’t been part of the Excel flow. I use this example in all of my automation conversations to remind people to be thoughtful about where and when they automate.
asielen 10 hours ago
The way you put that makes me think of the challenge younger generations are having with technology in general: kids raised on touch-screen interfaces vs. kids of older generations, who were raised on computers that required more technical skill to figure out. In the same way, when everything just works there is no difference, but when something goes wrong, the person who learned the skills beforehand has a distinct advantage. The question is whether AI gets good enough that occasionally slowing down to find a specialist is tenable. It doesn't need to be perfect, it just needs to be predictably not perfect. Experts will always be needed, but they may be more like car mechanics: there to fix hopefully rare issues and provide a tune-up, rather than building the cars themselves.
delaminator 10 hours ago
I used to be a maintenance data analyst in a welding plant welding about 1 million units per month. I was the only person in the factory who was a qualified welder.
layer8 4 hours ago
They also made the point that the less frequent failures become, the more tedious it is for the human operator to check for them. Their example: AI agents produce verbose plans of what they intend to do that are mostly fine, but occasionally contain critical failures the operator is supposed to catch.
DiscourseFan 11 hours ago
That's how it tends to go: automation removes some parts of the work but creates more complexity. Sooner or later that will also be automated away, and so on and so forth. AGI evangelists ought to read Marx's Capital.
jennyholzer 11 hours ago
[dead]