| ▲ | TeMPOraL 3 hours ago | |
It only tells you that you can't secure a system using an LLM as a component without completely destroying any value provided by using the LLM in the first place. Prompt injection cannot be solved without losing the general-purpose quality of an LLM; the underlying problem is also the very feature that makes LLMs general. | ||