sailfast 2 days ago

How do you prevent these models from reading secrets in your repos locally?

It’s one thing for env vars to be user-pasted, but typically you’re also giving the bot access to your file system so it can interrogate and understand your repos, right? Does this also block that access for env files by detecting them and applying granular permissions?

SparkyMcUnicorn 2 days ago | parent | next

I configure permission settings within projects.

https://code.claude.com/docs/en/settings#permission-settings
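
For example, a project-level .claude/settings.json along these lines (the Read deny rules follow the syntax in the linked docs; the specific paths are just placeholders for wherever your secrets live):

    {
      "permissions": {
        "deny": [
          "Read(./.env)",
          "Read(./.env.*)",
          "Read(./secrets/**)"
        ]
      }
    }

Since .claude/settings.json is checked into the repo, everyone who clones the project picks up the same restrictions.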

sailfast a day ago | parent

Ah yes - this is the way. Thanks.

woodrowbarlow a day ago | parent

this prevents claude from directly reading certain files, but it doesn't prevent claude from running a command that dumps the file to stdout and then reading that output. claude will just try to "cat" the file if it decides it wants to see it.
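
for illustration only (this is my guess at how far you'd have to take it, not something the docs prescribe), a deny list that tries to close that hole ends up blocking shell commands alongside direct reads:

    {
      "permissions": {
        "deny": [
          "Read(./.env)",
          "Bash(cat:*)",
          "Bash(head:*)",
          "Bash(tail:*)",
          "Bash(sed:*)"
        ]
      }
    }

and that still leaves grep, awk, python -c, and anything else that can print a file, so it's whack-a-mole.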

sailfast 15 hours ago | parent

Yeah - that’s kinda what I was thinking. Unless you’re doing quite granular approvals, it gets tricky.

woodrowbarlow 2 days ago | parent | prev

by putting secrets in your environment instead of in your files, and running AI tools in a dedicated environment that has its own set of limited and revocable secrets.
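
a rough sketch of that setup, assuming Docker, the claude CLI inside the image, and a revocable fine-grained token (the image name and the REVOCABLE_TOKEN variable are hypothetical):

    # Secrets live in the environment rather than in files in the repo, so
    # mounting the working tree is safe; the only secret the agent can see
    # is a scoped, revocable token passed in explicitly.
    docker run --rm -it \
      -v "$PWD":/workspace -w /workspace \
      --env GITHUB_TOKEN="$REVOCABLE_TOKEN" \
      agent-sandbox claude

if the agent ever leaks that token, you rotate one credential and nothing else is exposed.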

sailfast a day ago | parent

Yes - separate secrets always - but you've still got local or dev secrets. Seems like the above permissions are the right way to go in the end. Thanks.