lazide 10 hours ago

Even ‘true general intelligence’ (if we count humans as that) screws up frequently, sometimes (often?) intentionally for its own benefit - which is why accountability is such a necessary element.

If someone won’t be held liable for the end result at some point, then there is no reason to ensure an even somewhat reasonable end result. It’s fundamental.

Which is also why I suspect so many companies are pushing ‘AI’ so hard - to be able to do unreasonable things while having a smokescreen to avoid being penalized for the consequences.

hypeatei 10 hours ago | parent

> to be able to do unreasonable things while having a smokescreen

Maybe, but I feel like the calculus remains unchanged for professions that already lack accountability (police, military, C-suite, three letter agencies, etc.); LLMs are yet another tool in their toolbox to obfuscate but they were going to do that anyway.

Peons will continue to face consequences and sanctions if they screw up by using hallucinated output.

lazide 8 hours ago | parent

All of those professions definitely have accountability on paper, per the nominal rules of the system. Often extremely severe accountability.

The actual systems do everything they can to avoid that accountability in practice, often including violating the rules themselves or corrupting enforcement - for exactly the same reasons corporations are trying to avoid accountability too.

Accountability is expensive, and way less convenient than doing whatever you want whenever you want.