cousin_it 2 hours ago

Yeah. Even more than that, I think "prompt injection" is just a fuzzy category. Imagine an AI that has been trained to be aligned. Some company uses it to process some data. The AI notices that the data contains CSAM. Should it speak up? If no, that's an alignment failure. If yes, that's data bleeding through into behavior, exactly the thing parameterized SQL queries were designed to prevent. Pick your poison.
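(For reference, the SQL mechanism being alluded to: a parameterized query binds untrusted input as data, so the database never parses it as code. A minimal sketch using Python's stdlib sqlite3; the table and payload are made up for illustration.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (body TEXT)")

# Attacker-controlled input that would break a naively
# string-concatenated query.
payload = "x'); DROP TABLE notes; --"

# Parameterized query: the ? placeholder binds payload as data,
# never as SQL, so the injection attempt is inert.
conn.execute("INSERT INTO notes (body) VALUES (?)", (payload,))

# The table survives and stores the payload verbatim.
print(conn.execute("SELECT body FROM notes").fetchone()[0])
```

The point of the analogy is that an LLM has no equivalent of the `?` placeholder: everything in the context window is in-band.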

WarmWash an hour ago | parent [-]

We want a human level of discretion.

AlotOfReading an hour ago | parent | next [-]

Organizations struggle even to let humans exercise their discretion. Pretty much every retail worker has encountered a rigidly enforced policy that would be better off ignored in most cases.

jacquesm an hour ago | parent | prev [-]

Yes, because humans would never fall for instructions embedded in data. If they did we'd surely have a name for something like that ;)

By the way, when was the last time you looked out of your window?