ogrisel 3 hours ago

I think there is far less than a 1% chance of this happening, but there are probably millions of Antigravity users at this point; even a one-in-a-million chance of this happening is already a problem at that scale.

We need local sandboxing for FS and network access (e.g. via Linux namespaces and `cgroups`, or similar mechanisms on non-Linux OSes) to run these kinds of tools more safely.
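For illustration, here's a minimal Linux-only sketch of the namespace building blocks. This is not what any of these products actually do, and a real sandbox would also remount a restricted filesystem view and add seccomp/Landlock filters; it's just the bare primitives:

```go
// sandbox.go — minimal sketch: run a command in fresh user, network,
// and mount namespaces. NOT a complete sandbox: the mount namespace
// still sees the host FS until you remount or pivot_root inside it.
package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: sandbox <command> [args...]")
		os.Exit(1)
	}
	cmd := exec.Command(os.Args[1], os.Args[2:]...)
	cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
	cmd.SysProcAttr = &syscall.SysProcAttr{
		// CLONE_NEWUSER: unprivileged namespaces without root.
		// CLONE_NEWNET: empty network namespace => no outbound traffic.
		// CLONE_NEWNS: private mount namespace to build a restricted FS view in.
		Cloneflags: syscall.CLONE_NEWUSER | syscall.CLONE_NEWNET | syscall.CLONE_NEWNS,
		// Map the current user to root inside the new user namespace.
		UidMappings: []syscall.SysProcIDMap{{ContainerID: 0, HostID: os.Getuid(), Size: 1}},
		GidMappings: []syscall.SysProcIDMap{{ContainerID: 0, HostID: os.Getgid(), Size: 1}},
	}
	if err := cmd.Run(); err != nil {
		fmt.Fprintln(os.Stderr, "sandboxed command failed:", err)
		os.Exit(1)
	}
}
```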

cube2222 3 hours ago

Codex does such sandboxing, fwiw. In practice it gets pretty annoying when, e.g., it wants to use the Go CLI, which uses a global module cache. Claude Code recently got something similar[0], but I haven’t tried it yet.

In practice I just use a Docker container when I want to run Claude with `--dangerously-skip-permissions`.
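Roughly like this (a sketch in Go for concreteness, though a shell one-liner does the same; `my-agent-image` is a placeholder image with the CLI preinstalled — the container still needs network to reach the model API, but filesystem damage is confined to the mounted project dir):

```go
// run-in-docker.go — rough sketch of the containment described above:
// launch an agent CLI inside a throwaway container that can only see
// the current project directory.
package main

import (
	"log"
	"os"
	"os/exec"
)

func main() {
	cwd, err := os.Getwd()
	if err != nil {
		log.Fatal(err)
	}
	cmd := exec.Command("docker", "run", "--rm", "-it",
		"-v", cwd+":/work", // only the project dir is visible...
		"-w", "/work", // ...and it's the working directory
		"my-agent-image", // placeholder image with the CLI preinstalled
		"claude", "--dangerously-skip-permissions",
	)
	cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}
```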

[0]: https://code.claude.com/docs/en/sandboxing

BrenBarn 3 hours ago

We also need laws. Releasing an AI product that can (and does) do this should be like selling a car that blows your finger off when you start it up.

jpc0 2 hours ago

This is more akin to selling a car to an adult who cannot drive, who then proceeds to ram it through their garage door.

It's perfectly within the capabilities of the car to do so.

The burden of proof is much lower, though, since the worst that can happen is that you lose some money or, in this case, hard drive contents.

For the car, the seller would be investigated because there was a possible threat to life; for an AI, it's buyer beware.

pas 2 hours ago

There are laws about waiving liability for experimental products.

Sure, it would be amazing if everyone had to take a 100-hour course on how LLMs work before interacting with one.

chickensong 34 minutes ago

Google will fix the issue, just like automakers fix their issues. Your comparison is ridiculous.