salawat 10 hours ago

Except that that very accountability sink is relied on by senior management/CxOs the world over. The only difference is that before AI, it was the middle manager's fault. We didn't tell anyone to break the law. We just put in place incentive structures that require it, play coy, and let anticipatory obedience do the rest. Bingo. Accountability severed. You can't prove I said it in a court of law, and skeevy shit gets done because some poor bloke down the ladder is afraid of getting fired if he doesn't pull out all the stops to meet productivity quotas.

AI is just better because no one can actually explain why the thing does what it does. Perfect management scapegoat without strict liability being made explicit in law.

pixl97 8 hours ago | parent | next [-]

Hence why many life-and-death things require licensing and compliance, and tend to come with very long paper trails.

The software world has been very allergic to getting anywhere near the vicinity of a system like that.

salawat 8 hours ago | parent [-]

Did I give the impression that the phenomenon was unique to software? Hell, Boeing was a shining example of the principle in action with the 737 MAX. It doesn't get much more "people live and die by us, and we know it (but management set up the culture and incentives to make a deathtrap anyway)." No one to blame, of course. These things just happen.

Licensure alone doesn't solve all these ills. And for that matter, once regulatory capture happens, it has a tendency to make things worse due to consolidation pressure.

Muromec 6 hours ago | parent | prev [-]

>AI is just better because no one can actually explain why the thing does what it does. Perfect management scapegoat without strict liability being made explicit in law.

AI is worse in that regard, because, although you can't explain why it does what it does, you can point a finger at it, say "we told you so," and provide the receipts of repeated warnings that the thing has a tendency to do exactly this.