lordnacho 2 days ago
This is why there are certain jobs AI can never take: we are wired to hold humans responsible. Even though a pilot can do much of his work via autopilot, we need a human to be accountable. For the pilot, that means sitting in the plane. There are plenty of other jobs, mostly high-earning expert roles, where we need to be able to place responsibility on a person.

For those jobs, the upside is that the tool will still be available for the expert to use and capture the benefits from.

This lawyer fabricating his filings is going to be among the first in a wave of related stories: devs who check in code they don't understand, doctors diagnosing patients without looking at them, scientists skipping their experiments, and more.
unshavedyak 2 days ago | parent
> This is why there are certain jobs AI can never take

You're thinking too linearly, imo. Your examples are jobs AI will "take", just perhaps not entirely replace. That is, if liability is the only thing stopping these roles from being replaced, what's stopping the remaining humans from simply assuming more liability? Why can't one lawyer assume the liability of ten lawyers?
pjc50 2 days ago | parent
The book https://en.wikipedia.org/wiki/The_Unaccountability_Machine introduces the term "accountability sink", which is very useful for these discussions. Increasingly complicated systems generate these voids, where ultimately no human can be singled out or held responsible.

AI offers an incredible caveat emptor tradeoff: you can get a lot more done more quickly, so long as you don't care about the quality of the work and cannot hold anyone responsible for that quality.