snailmailman 10 hours ago
Ah yes. “Non-existent security” is only a pesky detail that will surely be ironed out. It’s not a critical flaw in the entire LLM ecosystem that computers can now be tricked into doing things just by being asked in the right way. Anything in the context might be a prompt injection attack, and there isn’t really any reliable solution to that, but let’s hook everything up to it anyway, and also give it the tools to do anything and everything. There is still a long way to go before these systems are secure. Apple is, I think wisely, staying out of this arena until it’s solved, or at least less of a complete mess.
mastermage 8 hours ago
I think he was being sarcastic
| ||||||||