Sohcahtoa82 2 hours ago

Your comment is a perfect example of not caring about nuance. More charitably, it comes from a place of naivete about how LLMs work.

LLMs are non-deterministic [0]. They can't be trusted to fully follow your prompts. As such, you have to be careful about what permissions they have.

Like...I use Claude Code. I allow it to run some shell commands that only read (grep, ls, find, etc.). I will never allow it to run Python code without checking with me first. Yeah, it slows me down when I have to answer its prompt for permission to run Python, but the alternative is outright dangerous.
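For concreteness, that kind of read-only allowlist can be expressed in Claude Code's permission settings. A sketch from memory (the exact rule syntax and file location may differ by version, so check the current docs):

```json
{
  "permissions": {
    "allow": ["Bash(grep:*)", "Bash(ls:*)", "Bash(find:*)"],
    "ask": ["Bash(python:*)", "Bash(python3:*)"]
  }
}
```

With rules like these, read-only commands run without interruption while anything that executes Python still requires explicit confirmation.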

Compare this with any other tool, say, something as simple as `rm`. I expect that if I call `rm some.file`, it will only delete that file. If it deletes anything else, that's absolutely the fault of the tool, and I should not bear any responsibility for mistakes the tool makes as long as my input was correct.

I do not give LLMs that same latitude. LLMs operate probabilistically and have far more degrees of freedom in how they interpret and act on your input, so I hold them (and myself) to a different standard of scrutiny and accountability.

[0] Technically, LLMs are actually completely deterministic. Run any given input through the neural network, and you'll get the exact same output [1], but that output is a list of probabilities of the next potential token. Top-k sampling, temperature, and other options essentially randomize the chosen token, making them non-deterministic in practice, though APIs will often allow you to disable all that and make them deterministic.
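To make the footnote concrete, here is a toy sketch (not any real model's code) of the sampling step: the scores ("logits") for the next token are fixed given the input, but temperature and top-k sampling turn them into a random draw, while greedy decoding stays deterministic.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Temperature rescales logits before normalizing; as T -> 0 this
    # approaches picking the argmax every time.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(logits, temperature=1.0, top_k=None, rng=random):
    # The randomness lives entirely in this draw, not in the logits.
    probs = softmax(logits, temperature)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    if top_k is not None:
        ranked = ranked[:top_k]  # keep only the k most likely tokens
    weights = [probs[i] for i in ranked]
    return rng.choices(ranked, weights=weights, k=1)[0]

def greedy_token(logits):
    # "Deterministic mode": always take the single most probable token.
    return max(range(len(logits)), key=lambda i: logits[i])

logits = [2.0, 1.5, 0.3, -1.0]  # hypothetical scores for a 4-token vocab
print(greedy_token(logits))      # same answer on every run
print(sample_next_token(logits)) # can differ from run to run
```

The point of the sketch: disabling sampling (greedy decoding, or top_k=1) collapses the random draw back to a deterministic function of the input, which is roughly what API options like temperature 0 do.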

[1] Even this statement isn't quite true because floating point math is not associative.
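The floating-point caveat is easy to demonstrate: regrouping the same three additions changes the result.

```python
# Floating-point addition is not associative: grouping changes the result.
a, b, c = 0.1, 0.2, 0.3
print((a + b) + c)  # 0.6000000000000001
print(a + (b + c))  # 0.6
print((a + b) + c == a + (b + c))  # False
```

The same effect shows up at scale: summing the terms of a matrix multiply in a different order (e.g. across a different number of GPUs) can produce slightly different logits for identical input.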

pier25 37 minutes ago | parent

> LLMs are non-deterministic

For someone who complains about a lack of nuance, it's surprising that you're completely missing my point. The lack of trust is precisely my point.

Either AI companies are held accountable for providing an unreliable service, or they need to stop selling and marketing these LLMs as if they were infallible.