edot 2 hours ago

Exactly. So when an LLM does something dumb, the blame should fall on the human who owns that deployment. A dead simple example: if I paste confidential information into ChatGPT, that's on me. If I give Codex access to an environment where it can reach confidential information, that's also on me. At most I could blame my IT department for granting me the technical permissions to do such a thing, but it's still humans at fault (and I believe in taking Extreme Ownership, so I wouldn't even do that). LLMs are just technology like any other.