| ▲ | johnisgood 7 hours ago | |
The response to the user is itself an exfiltration channel. If the LLM can read secrets and produce output, an injection can encode data in that output. You haven not cut off a leg, you have just made the attacker use the front door, IMO. | ||
| ▲ | 6 hours ago | parent [-] | |
| [deleted] | ||