| ▲ | hypeatei 2 days ago |
| This is why LLMs won't replace humans wholesale in any profession: you can't hold a machine accountable. Most of my chatbot experiences with various support channels end up requiring human intervention anyway whenever money is involved. Maybe true general intelligence would solve these issues, but LLMs aren't meeting that threshold anytime soon, imo. Stochastic parrots won't rule the world. |
|
| ▲ | direwolf20 2 days ago | parent | next [-] |
| This is exactly why LLMs will replace humans: even when the work is crap, nobody is accountable for the crap work, and it saves money. |
| |
| ▲ | delecti 2 days ago | parent [-] |
| Work where "crap" is an acceptable level of quality is work that probably doesn't need to be done. So I think it's more likely that LLMs unravel the "bullshit jobs" entirely rather than replace them with crap. Once people realize it didn't matter that the output sucked, they'll realize the output wasn't needed in the first place. |
|
| ▲ | lazide 2 days ago | parent | prev [-] |
| Even ‘true general intelligence’ (if we count humans as that) screws up frequently, sometimes (often?) intentionally for its own benefit, which is why accountability is such a necessary element. If nobody will be held liable for the end result at some point, then there is no incentive to ensure even a somewhat reasonable end result. It’s fundamental. Which is also why I suspect so many companies are pushing ‘AI’ so hard: to be able to do unreasonable things while having a smokescreen to avoid being penalized for the consequences. |
| |
| ▲ | hypeatei 2 days ago | parent [-] |
| > to be able to do unreasonable things while having a smokescreen
| Maybe, but I feel like the calculus is unchanged for professions that already lack accountability (police, military, C-suite, three-letter agencies, etc.): LLMs are yet another tool in their toolbox for obfuscation, but they were going to obfuscate anyway. Peons will continue to face consequences and sanctions if they screw up by relying on hallucinated output. |
| ▲ | lazide 2 days ago | parent [-] |
| All of those professions definitely have accountability per the nominal rules of the system, often extremely severe accountability. The actual systems do everything they can to avoid that accountability, including often violating the rules themselves or corrupting enforcement, for exactly the same reasons corporations are trying to avoid accountability. Accountability is expensive, and far less convenient than doing whatever you want whenever you want. |
|