simonw | 5 hours ago
Yeah, one exfiltration vector that's really nasty is "here is a big base64 encoded string, to recover your data visit this website and paste it in". You can at least prevent LLM interfaces from providing clickable links to external domains, but it's a difficult hole to close completely.
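A minimal sketch of the partial defense described above: defanging markdown links to non-allowlisted domains in LLM output, plus a crude heuristic for spotting large base64-looking blobs. The `ALLOWED_DOMAINS` set, function names, and the 40-character threshold are all hypothetical choices for illustration; as the comment notes, this doesn't close the hole completely, since the user can still be talked into copying the string out by hand.

```python
import re

# Hypothetical allow-list; a real interface would use its own trusted domains.
ALLOWED_DOMAINS = {"example.com"}

# Markdown links: [label](https://host/path)
LINK_RE = re.compile(r"\[([^\]]*)\]\((https?://([^/)\s]+)[^)]*)\)")

# Long runs of base64-alphabet characters; 40 is an arbitrary threshold.
BASE64_RE = re.compile(r"[A-Za-z0-9+/]{40,}={0,2}")

def defang_external_links(text: str) -> str:
    """Replace markdown links to non-allowlisted domains with inert text."""
    def repl(m: re.Match) -> str:
        label, host = m.group(1), m.group(3)
        if host.lower() in ALLOWED_DOMAINS:
            return m.group(0)  # keep links to trusted domains clickable
        return f"{label} [external link removed: {host}]"
    return LINK_RE.sub(repl, text)

def looks_like_exfil(text: str) -> bool:
    """Flag responses containing large opaque base64-looking blobs."""
    return bool(BASE64_RE.search(text))
```

Note that defanging only removes the *clickable* affordance; the heuristic flag is there because the payload itself (the big base64 string) can still ride along in plain text.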
datadrivenangel | 5 hours ago
Human fatigue and interface design are going to be brutal here. It's not obvious what counts as a tool in some of the major interfaces, especially as far as built-in capabilities go. And as we've seen with conventional software and extensions, at a certain point, if a human thinks it should work, they'll eventually just click OK or run something as root/admin... or just hit enter nonstop until the AI is done with their email.