nemomarx a day ago

The way to stop it from executing an action is probably to put controls on the action side, not on the LLM? Whitelist which API commands it can send so nothing harmful can get through, and so on.
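
A minimal sketch of that idea, with the ToolCall shape, the action names, and the executeToolCall stub all made up for illustration:

    // Allowlist gate placed in front of the model's tool calls.
    // The check lives outside the model, so a jailbroken prompt can't bypass it.
    type ToolCall = { name: string; args: Record<string, unknown> };

    const ALLOWED_ACTIONS = new Set(["search_docs", "read_ticket", "post_comment"]);

    // Stub standing in for whatever actually performs a permitted action.
    async function executeToolCall(call: ToolCall): Promise<unknown> {
      console.log(`executing ${call.name}`, call.args);
      return null;
    }

    async function guardedExecute(call: ToolCall): Promise<unknown> {
      if (!ALLOWED_ACTIONS.has(call.name)) {
        // Refuse anything outside the allowlist rather than trusting the model.
        throw new Error(`Action "${call.name}" is not whitelisted`);
      }
      return executeToolCall(call);
    }

    // Rejected no matter what the model was convinced to ask for:
    guardedExecute({ name: "delete_repo", args: { repo: "prod" } }).catch(console.error);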

omneity 20 hours ago | parent | next

This is similar to the halting problem. You can only write an effective policy if you can predict all the side effects and their ramifications.

Of course you could do what Deno and similar systems do and deny internet or filesystem access outright, but then you limit the usefulness of the AI system significantly. Tricky problem, to be honest.
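
For what it's worth, Deno's model isn't strictly all-or-nothing: permissions are denied by default and can be granted per resource at launch. A small illustration (the file name and host here are made up):

    // agent.ts (illustrative)
    // Run with:  deno run --allow-net=api.example.com agent.ts
    // Network access to other hosts, and any file reads or writes, are refused
    // unless the corresponding --allow-* flag is granted.
    const res = await fetch("https://api.example.com/status");
    console.log(res.status);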

Scarblac a day ago | parent | prev

It won't be long before people start using LLMs to write such whitelists too. And the APIs.