fsh 6 hours ago

I find these posts hilarious. LLMs are ultimately story generators, and "oops, I DROP'ed our production database" is a common and compelling story. No wonder LLM agents occasionally do this.

einrealist 6 hours ago | parent | next [-]

Also funny how people (including LLM vendors, like Cursor) think that rules in a system prompt (or custom rules) are real safety measures.

hunterpayne an hour ago | parent | prev | next [-]

Sure, but do junior devs find another key in an unrelated file and use that key instead of their own? Maybe you read about someone doing this once, and maybe it really happened, or maybe someone was being overly "creative" for entertainment purposes. But it basically doesn't happen in practice. LLMs, on the other hand, make this mistake more and more frequently.

beej71 6 hours ago | parent | prev | next [-]

Like we say in adventure motorcycling: "It's never the stuff that goes right that makes the best stories." :)

Retr0id 4 hours ago | parent | prev | next [-]

It's also possible it's only a compelling story, and not based on any real events.

nothinkjustai 4 hours ago | parent | prev | next [-]

Yeah people don’t understand that if you put an LLM in a position where it’s plausible that a human might drop the DB, it very well might do that since it’s a likely next step. Ahahaha

efilife 4 hours ago | parent | prev [-]

This is exactly what I have in mind when something like this happens. Sometimes it generates the story you want, sometimes not.