Remix.run Logo
TeMPOraL 5 hours ago

Unfortunately, prompt injection isn't like SQL injection - it's like social engineering. It cannot be solved, because at a fundamental level, this "vulnerability" is also the very thing that makes the language models tick, and why they can be used as general purpose problem solvers. Can't have one without the other, because "code" and "data" distinction does not exist in reality. Laws of physics do not recognize any kind of "control band" and "data band" separation. They cannot, because what part of a system is "code" and what is "data" depends not on the system, but the perspective through which one looks at it.

There's one reality, humans evolved to deal with it in full generality, and through attempts at making computers understand human natural language in general, LLMs are by design fully general systems.