stingraycharles 5 hours ago

> This is why you don’t hire interns!

I’d like to rephrase this as: this is why you don’t give interns permissions to delete your prod database.

This is a process failure, not an AI failure.

I honestly don’t understand why people blame AI here, when you literally gave AI permissions to do exactly this.

It’s like blaming AWS for exposing some database to the public. That’s just not AWS’ fault. Neither is this the fault of AI.

amluto 4 hours ago | parent | next [-]

There is a major issue with current AI tools: they effectively want access to everything their user has access to. The whole sandbox structure is wrong (although various people have vibe-coded assorted improvements).
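As a minimal sketch of the opposite approach, you can launch a tool with an explicitly whitelisted environment instead of letting it inherit every credential in the parent shell (the variable name here is a stand-in, not from the original post):

```shell
# The parent shell holds a secret, as it typically would:
export AWS_SECRET_ACCESS_KEY="real-secret"

# `env -i` starts the child with an empty environment; we pass through
# only PATH. The secret never reaches the child process.
leaked=$(env -i PATH="$PATH" sh -c 'echo "${AWS_SECRET_ACCESS_KEY:-absent}"')
echo "$leaked"
```

This is the inverse of how most current integrations work: instead of subtracting a few dangerous things from "everything the user has", you add only what the tool needs.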

yonatan8070 3 hours ago | parent | next [-]

Another issue I've noticed is that they're sometimes very resourceful. For example, when Codex can't directly edit a file due to sandboxing restrictions, rather than asking "hey, can I apply this diff to the file?", it'll ask for permission to run a `cat <<EOF` command to rewrite the whole file, which the UI doesn't surface properly (it just shows the first line...).
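For readers unfamiliar with the trick: a heredoc lets one approved "run a command" action replace a file wholesale. A UI that truncates to the first line would show only `cat > /tmp/demo_config.py <<'EOF'` for something like this (the filename and contents are illustrative):

```shell
# Everything between the EOF markers overwrites the file in one shot.
cat > /tmp/demo_config.py <<'EOF'
# dozens of lines can follow here, replacing the file wholesale
DEBUG = True
EOF

# What a first-line-only UI would surface is just the `cat` invocation:
head -1 /tmp/demo_config.py
```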

This sounds similar to what's described in the "Claude deleted my DB" post: it decided "I need to do X", then searched for whatever would let it do X, regardless of that tool's intended purpose.

amluto 2 hours ago | parent [-]

I amused myself by removing codex-rs’s web search tool and then asking it to search for “foo”. It wrote a Python script to do the search.

traderj0e 2 hours ago | parent | prev | next [-]

If you pretend you have an intern with their own machine and run the AI agents on that machine, you have the same separation.

zahlman 3 hours ago | parent | prev [-]

If you want them to be able to write code and then run tests on that code, it can be a bit difficult to restrict access meaningfully....

amluto 2 hours ago | parent [-]

Only for code that can’t be tested in an isolated environment, and designing code that can’t be tested in an isolated environment is generally a mistake for quite a few reasons.
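One common shape of that isolation, sketched here with hypothetical paths: the code under test reads its data location from an environment variable, so tests get a throwaway copy and even a destructive run can't touch the real store.

```shell
# Stand-in for the real data store (hypothetical path):
real_db=/tmp/real_app.db
echo "production data" > "$real_db"

# Tests run against a disposable copy in a temp dir:
workdir=$(mktemp -d)
cp "$real_db" "$workdir/app.db"

# A maximally destructive "test" truncates the DB it is pointed at --
# which is only ever the copy:
DB_PATH="$workdir/app.db" sh -c ': > "$DB_PATH"'

cat "$real_db"      # the real store is untouched
rm -rf "$workdir"   # the damaged copy is discarded
```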

Romario77 4 hours ago | parent | prev | next [-]

If you read what happened, it's not that cut and dried. Railway (their cloud provider) gave them a token for operations. The AI was working on staging at the time. Since the token had wide-ranging permissions, the AI used it in its routine operations to delete a volume to fix something, and this deleted their prod and backup data as well.

So at least some of the blame here belongs to Railway: how their security was organized, and why deleting a volume deletes its backups as well.

They have since fixed some of these issues, so a similar mistake from someone else won't be as catastrophic.

dylan604 4 hours ago | parent | prev | next [-]

> I honestly don’t understand why people blame AI here,

Are you being hyperbolic here? Of course you understand why. Most people would much rather push blame somewhere else, anywhere else, than accept fault themselves. Whether that's out of fear of losing their job or damaging their personal reputation, the reasoning doesn't really matter.

locknitpicker 3 hours ago | parent | prev [-]

> I’d like to rephrase this as: this is why you don’t give interns permissions to delete your prod database.

Nowadays, AI code assistants are designed to execute their tools in your personal terminal, using your personal credentials, with access to all your personal data. See how every single AI integration extension for any IDE works.
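That's just the Unix default: any subprocess an assistant spawns in your terminal inherits whatever the parent shell exports. A quick demonstration (the token name is made up for illustration):

```shell
# A credential sitting in your shell, as API keys usually do:
export FAKE_API_TOKEN="sk-demo-123"

# Any child process -- including one launched by an IDE extension's
# terminal tool -- sees it automatically, no special access required:
inherited=$(sh -c 'echo "$FAKE_API_TOKEN"')
echo "$inherited"
```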

You cannot shift blame if by design it is using your credentials for everything it does.