engeljohnb 2 hours ago
Interns are human. Humans can always be held accountable. A computer never can. Therefore, no one should leave a computer in charge of human decisions.
edot 2 hours ago | parent
Exactly. So when an LLM does something dumb, the blame should fall on the human who owns the implementation of that LLM. A dead simple example: if I paste confidential information into ChatGPT, that's on me. If I let Codex access an environment where it can get to confidential information, that's also on me. At best I could also blame my IT department for granting me the technical permissions to do such a thing, but it's still humans at fault (and since I believe in taking Extreme Ownership, I wouldn't even do that). LLMs are just technology like any other.