Remix.run Logo
jacquesm 2 hours ago

> (It's also why "hallucinations" and "prompt injection" are not bugs, but fundamental facets of what makes LLMs useful. They cannot and will not be "fixed", any more than you can "fix" humans to be immune to confabulation and manipulation. It's just the nature of fully general sytems.)

This is spot on and one of the reasons why I don't think putting LLMs or LLM based devices into anything that requires security is a good idea.