| ▲ | edot 5 hours ago |
| This is why you don’t hire interns! They can delete things and cause havoc! The same people who blame AI for their own failure to configure permissions properly would also blame interns for deleting production whatever. Blame should go up, praise should go down. People always invert these. |
|
| ▲ | stingraycharles 4 hours ago | parent | next [-] |
| > This is why you don’t hire interns! I’d like to rephrase this: this is why you don’t give interns permission to delete your prod database. This is a process failure, not an AI failure. I honestly don’t understand why people blame the AI here, when you literally gave the AI permission to do exactly this. It’s like blaming AWS when you expose one of your databases to the public. That’s just not AWS’s fault. Neither is this the fault of AI. |
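A minimal sketch of the scoping stingraycharles is arguing for, assuming PostgreSQL reached via psycopg2; the `agent_ro` role, `staging` database, and connection string are invented for illustration:

    import psycopg2  # assumption: PostgreSQL via psycopg2

    # Give the agent its own identity with read-only access to staging only,
    # so "delete the prod database" is not a permission it can exercise.
    ddl = """
    CREATE ROLE agent_ro LOGIN PASSWORD 'rotate-me';
    GRANT CONNECT ON DATABASE staging TO agent_ro;
    GRANT USAGE ON SCHEMA public TO agent_ro;
    GRANT SELECT ON ALL TABLES IN SCHEMA public TO agent_ro;
    """

    with psycopg2.connect("dbname=staging user=admin") as conn:
        with conn.cursor() as cur:
            cur.execute(ddl)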
| |
| ▲ | amluto 4 hours ago | parent | next [-] | | A major issue with current AI tools is that they effectively want access to everything their user has access to. The whole sandbox structure is wrong (although various people have vibe-coded assorted improvements). | | |
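One hedged sketch of the separation amluto is describing: launch the agent with a scrubbed environment so it cannot inherit the user's credentials. The `codex` invocation and its `--sandbox` flag are assumptions here and vary by tool; the environment filtering is the point:

    import os
    import subprocess

    # Pass through only benign variables; API keys, cloud tokens, etc. stay behind.
    SAFE_VARS = {"PATH", "HOME", "LANG", "TERM"}
    env = {k: v for k, v in os.environ.items() if k in SAFE_VARS}

    subprocess.run(
        ["codex", "--sandbox", "workspace-write"],  # hypothetical agent invocation
        cwd="/path/to/project",  # the one directory the agent should touch
        env=env,
        check=True,
    )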
| ▲ | yonatan8070 3 hours ago | parent | next [-] | | Another issue I've noticed is that they're sometimes very resourceful. For example, when Codex can't directly edit a file due to sandboxing restrictions, rather than asking "hey, can I apply this diff to the file?", it'll ask for permission to run a `cat <<EOF` heredoc command to rewrite the whole file, which the UI doesn't surface properly (it just shows the first line...). This sounds similar to what's described in the "Claude deleted my DB" post: it decided "I need to do X", then searched for whatever would let it do X, regardless of intended purpose. | | |
| ▲ | amluto 2 hours ago | parent [-] | | I amused myself by removing codex-rs’s web search tool and then asking it to search for “foo”. It wrote a Python script to do the search. |
| |
| ▲ | traderj0e 2 hours ago | parent | prev | next [-] | | If you pretend you have an intern with their own machine and run the AI agents on that machine, you have the same separation. | |
| ▲ | zahlman 3 hours ago | parent | prev [-] | | If you want them to be able to write code and then run tests on that code, it can be a bit difficult to restrict access meaningfully.... | | |
| ▲ | amluto 2 hours ago | parent [-] | | Only for code that can’t be tested in an isolated environment, and designing code that can’t be tested in an isolated environment is generally a mistake for quite a few reasons. |
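A small illustration of amluto's point, assuming Python: if the code takes its database connection as a parameter, an agent (or a test suite) can exercise it against a throwaway in-memory database instead of anything shared. The schema and function are invented for illustration:

    import sqlite3

    def archive_stale_users(conn, cutoff_days=90):
        # Works on any DB-API connection; nothing in here knows about prod.
        modifier = (f"-{cutoff_days} days",)
        conn.execute(
            "INSERT INTO archived_users SELECT * FROM users "
            "WHERE last_login < date('now', ?)", modifier)
        conn.execute(
            "DELETE FROM users WHERE last_login < date('now', ?)", modifier)

    # In tests, hand it a disposable database:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, last_login TEXT)")
    conn.execute("CREATE TABLE archived_users (id INTEGER, last_login TEXT)")
    archive_stale_users(conn)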
|
| |
| ▲ | Romario77 4 hours ago | parent | prev | next [-] | | If you read what happened, it's not that cut and dried. Railway (their cloud provider) gave them a token for operations. The AI was working on staging at the time. Since the token had wide-ranging permissions, the AI used it in its routine operations to delete a volume to fix something, and this resulted in the deletion of their prod and backup data. So here at least some of the blame belongs to Railway: how they organized their security, and how deleting a volume deletes its backups as well. They have since fixed some of these issues, so a similar mistake from someone else won't be as catastrophic. | |
| ▲ | dylan604 4 hours ago | parent | prev | next [-] | | > I honestly don’t understand why people blame AI here, Are you being hyperbolic here? Of course you understand why. Most people would much rather push blame somewhere else, anywhere else, than accept fault themselves. Whether that's out of fear of losing a job or of damaging a personal reputation, the reasoning doesn't really matter. | |
| ▲ | locknitpicker 3 hours ago | parent | prev [-] | | > I’d like to rephrase this as: this is why you don’t give interns permissions to delete your prod database. Nowadays, AI code assistants are designed to execute their tools in your personal terminal, using your personal credentials, with access to all your personal data. See how every single AI integration extension for any IDE works. You cannot shift the blame to the user when, by design, the tool uses your credentials for everything it does. |
|
|
| ▲ | xmcp123 17 minutes ago | parent | prev | next [-] |
| It's a weird world. I also feel pretty confident that if I were an intern who hallucinated regularly at work, I would have been fired, even if I were working for free. |
|
| ▲ | giancarlostoro 5 hours ago | parent | prev | next [-] |
| Yeah, I don't know why anyone would open up a codebase with any prod credentials to an LLM, or give prod credentials to an intern / junior developer. I always intentionally kept a "PROD"-only checkout of my projects, so that if I was going to run something in PROD mode, I knew I was going out of my way to do it. There even used to be a VS extension that changed the color of Visual Studio entirely based on your SLN file path, so I could easily remember which color of VS meant production and which meant development. I'd keep that checkout as basically a copy that was always on the latest of the master branch, for ease of confirmation. |
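The same "go out of your way for prod" habit can also be enforced at runtime; a minimal sketch, with invented variable names:

    import os
    import sys

    # Fail closed: running against prod requires a second, explicit opt-in.
    if os.environ.get("APP_ENV") == "prod" and os.environ.get("CONFIRM_PROD") != "yes":
        sys.exit("Refusing to run against prod without CONFIRM_PROD=yes")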
| |
| ▲ | ryandrake 4 hours ago | parent [-] | | It should take more than "credentials" to even access the prod database, let alone delete it. There's actual customer data there, likely personally identifiable information, maybe their home address, phone number, even real-time location? Very sensitive stuff. It should be a Very Big Deal to even access prod. Giving an engineer routine access to prod is a root problem here, along with that engineer laundering that access and handing it to an LLM. At many serious companies, even an insider attempt to access prod would light up a dashboard somewhere, and you might get a call from IT security. | | |
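A hedged sketch of what ryandrake's "Very Big Deal" could look like in code; the gate, the logger name, and the ticket check are all invented for illustration:

    import getpass
    import logging

    audit = logging.getLogger("prod-access")

    def get_prod_connection(reason, ticket=None):
        # Every attempt is logged for security review, and the gate fails closed.
        audit.warning("prod access requested by %s: %s", getpass.getuser(), reason)
        if ticket is None:
            raise PermissionError("prod access requires an approved break-glass ticket")
        # ...only past this point would real, short-lived credentials be issued...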
| ▲ | giancarlostoro 4 hours ago | parent [-] | | Yeah, I'm lucky if I even get READ ONLY credentials for prod in some cases. I don't know why anyone would have all the keys to the prod kingdom. |
|
|
|
| ▲ | AmbroseBierce 3 hours ago | parent | prev | next [-] |
| Well, a significant difference is that nobody is selling the concept of interns as the be-all and end-all solution to humanity's problems, unlike AI. |
| |
| ▲ | program_whiz 2 hours ago | parent [-] | | Yeah, the usual motte and bailey. Monday -- AI is taking over the world, tremble in fear! Tuesday -- sure, it did a boneheaded thing, but it's just a tool, no better than an intern; actually it's _your_ fault, all the data in the entire world isn't enough to train this system not to delete prod! |
|
|
| ▲ | engeljohnb 2 hours ago | parent | prev [-] |
| Interns are human. Humans can always be held accountable. A computer never can. Therefore, no one should leave a computer in charge of human decisions. |
| |
| ▲ | edot 2 hours ago | parent [-] | | Exactly. Thus the blame when an LLM does something dumb should fall on the human who owns the implementation of said LLM. A dead simple example: if I paste confidential information into ChatGPT, that’s on me. If I let Codex have access to an environment where it can get to confidential information, that’s also on me. At best I could also blame my IT department for giving me technical permissions to do such a thing, but still it’s humans at fault (and I believe in taking Extreme Ownership, so I wouldn’t even do that). LLMs are just technology like any other. |
|