TeMPOraL 3 hours ago
Airbags, yes. But you can't make it provably impossible for a car to crash into something and hurt or kill its occupants, other than by not building it in the first place. Same with LLMs: you can't secure them like regular programs without destroying any utility they provide, because their power comes from the very thing that also makes them vulnerable.
yencabulator an hour ago | parent
I see you've given up. I haven't. An LLM inside deterministic guardrails is a pretty good combo.
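A minimal sketch of what "deterministic guardrails" around an LLM can mean: the model only *proposes* actions, and a plain, auditable validator decides what actually runs. The tool allowlist, call shape, and path rules below are hypothetical, chosen only for illustration.

```python
# Hypothetical allowlist: tool name -> required argument names.
ALLOWED_TOOLS = {
    "read_file": {"path"},
    "list_dir": {"path"},
}

# Hypothetical deny rules, also plain data, reviewable without running a model.
FORBIDDEN_PATH_PREFIXES = ("/etc", "/root")


def validate_call(call: dict) -> tuple[bool, str]:
    """Deterministically accept or reject an LLM-proposed tool call."""
    tool = call.get("tool")
    if tool not in ALLOWED_TOOLS:
        return False, f"tool {tool!r} not in allowlist"
    args = call.get("args", {})
    if set(args) != ALLOWED_TOOLS[tool]:
        return False, "unexpected or missing arguments"
    path = args.get("path", "")
    if any(path.startswith(p) for p in FORBIDDEN_PATH_PREFIXES):
        return False, f"path {path!r} is off-limits"
    return True, "ok"


# The guardrail doesn't care *why* the model proposed a call: a
# prompt-injected request fails the same checks as any other.
print(validate_call({"tool": "read_file", "args": {"path": "/etc/passwd"}}))
print(validate_call({"tool": "read_file", "args": {"path": "/tmp/notes.txt"}}))
```

The point is that the security boundary lives in ordinary, testable code, not in the model's behavior; the LLM can be as confused or adversarially prompted as it likes, and the worst it can do is propose calls the validator rejects.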