cousin_it · 2 hours ago
Yeah. Even more than that, I think "prompt injection" is just a fuzzy category. Imagine an AI that has been trained to be aligned. Some company uses it to process some data. The AI notices that the data contains CSAM. Should it speak up? If not, that's an alignment failure. If so, that's data bleeding through into behavior, which is exactly what parameterized queries were designed to prevent in SQL. Pick your poison.
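
To make the SQL analogy concrete, here's a minimal sketch using Python's sqlite3 (the malicious input string is invented for illustration). Splicing data into the query string lets it change the query's behavior; the parameterized version keeps the same input confined to the data channel:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice')")

    user_input = "alice' OR '1'='1"  # hypothetical malicious input

    # Vulnerable: the input is spliced into the SQL text, so it is
    # interpreted as part of the query (classic injection).
    rows = conn.execute(
        f"SELECT * FROM users WHERE name = '{user_input}'"
    ).fetchall()
    print(rows)  # returns every row, not just alice's

    # Parameterized: the driver passes the input as a bound value,
    # so it can never be interpreted as SQL.
    rows = conn.execute(
        "SELECT * FROM users WHERE name = ?", (user_input,)
    ).fetchall()
    print(rows)  # returns nothing; no user is literally named that

The point of the analogy: an LLM has no equivalent of that second channel. Everything it reads sits in the same token stream that drives its behavior, which is why "don't let data affect behavior" and "notice bad things in the data" pull in opposite directions.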
WarmWash · an hour ago
We want a human level of discretion.
| ||||||||||||||