saberience | 5 days ago
I would say this is a feature, not a bug. Terminal and Bash or any shell can do this if the user sucks. I want Claude Code to be able to do anything and everything; that's why it's so powerful. Sure, I can also make it do bad stuff, but that's true of any tool. We don't ban knives just because they sometimes kill people; they're too useful.
OJFord | 5 days ago
I would say it's neither; it's complacent misuse by the user. As you allude to, that's already true of the tools we generally use, but non-deterministic & especially 'agentic' AI makes the stakes/likelihood of it going wrong so much higher.

Don't use an MCP server with permission (capability) to do more than you want, regardless of whether you think you're instructing the AI tool to do the bad thing it's technically capable of. Don't run AI tools with filesystem access outside of something like a container with only a specific whitelist of directory mounts. Assume that the worst that could happen with the capability given will happen.
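As a minimal sketch of the "whitelist of directory mounts" idea: the snippet below launches a containerized agent with exactly one host directory bind-mounted and no network access. The image name my-agent-image and the mount paths are hypothetical placeholders, not a real Claude Code setup.

    # Sketch: run a (hypothetical) agent image in a container that can only
    # see an explicit allowlist of host directories, with networking disabled.
    import subprocess
    from pathlib import Path

    # host path -> path inside the container; everything else stays invisible
    ALLOWED_MOUNTS = {
        Path.home() / "projects" / "demo": "/work",
    }

    cmd = ["docker", "run", "--rm", "-it", "--network", "none", "--workdir", "/work"]
    for host, dest in ALLOWED_MOUNTS.items():
        cmd += ["--mount", f"type=bind,src={host},dst={dest}"]
    cmd.append("my-agent-image")  # hypothetical image name

    subprocess.run(cmd, check=True)

The point is the default-deny posture: if you could live with the worst-case use of every capability you grant, a misbehaving tool call stays an annoyance rather than an incident.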
zahlman | 5 days ago
> Terminal and Bash or any shell can do this, if the user sucks.

But at least they will do it deterministically.