Terr_ a day ago

My cynical rule of thumb: by default, treat LLMs like JavaScript logic offloaded into a stranger's web browser.

The risks are similar: no prompts or data that go in can reliably be kept secret; a sufficiently motivated stranger can make it send back completely arbitrary results; and some of those results may trigger very bad things depending on how you use, or even just display, them on your own end.
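A minimal sketch of the "even just displaying them" risk, in the spirit of the browser analogy: model output gets the same treatment as untrusted client input before it touches your HTML. The helper names (`escapeHtml`, `renderModelReply`) are hypothetical, not from any library.

```typescript
// Treat LLM output exactly like untrusted client input: escape before rendering.
// escapeHtml and renderModelReply are illustrative helpers, not a real API.

function escapeHtml(untrusted: string): string {
  // Escape the characters that let plain text break out into markup.
  return untrusted
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

function renderModelReply(reply: string): string {
  // Never interpolate model output into HTML (or SQL, or a shell) raw.
  return `<p>${escapeHtml(reply)}</p>`;
}

// A reply crafted by a sufficiently motivated stranger:
const hostile = `<img src=x onerror="alert('pwned')">`;
console.log(renderModelReply(hostile));
// The markup comes out inert: rendered as text rather than an executable tag.
```

The same discipline applies to any other sink the output reaches, such as SQL queries, shell commands, or tool-call arguments.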

P.S. This conceptual shortcut doesn't quite capture the danger of poisoned training data, which could sabotage every instance even when it happens to be hosted by an honorable stranger.