▲ | spuz 4 hours ago
There are a lot of people who don't know stuff. Nothing wrong with that. He says in his video "I love Google, I use all the products. But I was never expecting for all the smart engineers and all the billions that they spent to create such a product to allow that to happen. Even if there was a 1% chance, this seems unbelievable to me" and for the average person, I honestly don't see how you can blame them for believing that.
|
▲ | ogrisel 3 hours ago
I think there is far less than a 1% chance of this happening, but there are probably millions of Antigravity users at this point, so even a one-in-a-million chance of this happening is already a problem. We need local sandboxing for filesystem and network access (e.g. via `cgroups` on Linux, or similar mechanisms for non-Linux OSes) to run these kinds of tools more safely.
▲ | cube2222 3 hours ago
Codex does such sandboxing, fwiw. In practice it gets pretty annoying when e.g. it wants to use the Go CLI, which uses a global module cache. Claude Code recently got something similar[0], but I haven't tried it yet. In practice I just use a Docker container when I want to run Claude with --dangerously-skip-permissions.

[0]: https://code.claude.com/docs/en/sandboxing
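[Editorial illustration, not part of the thread: a minimal sketch of the sandboxing the two comments above describe, i.e. running an agent CLI inside a Docker container with networking disabled and only one writable project directory. The image name agent-sandbox:latest is a placeholder and the wrapper script is an assumption; only the `--dangerously-skip-permissions` flag comes from the comment above.]

    # sandbox.py: hedged sketch, confine an agent CLI to one directory with no network.
    import os
    import subprocess
    import sys

    def run_sandboxed(cmd: list[str], project_dir: str) -> int:
        project_dir = os.path.abspath(project_dir)
        docker_cmd = [
            "docker", "run", "--rm", "-it",
            "--network", "none",            # no network access from inside the container
            "--read-only",                  # container root filesystem is read-only
            "--tmpfs", "/tmp",              # scratch space for tools that need /tmp
            "-v", f"{project_dir}:/work",   # the only writable host path the agent can touch
            "-w", "/work",
            "agent-sandbox:latest",         # placeholder image with the agent CLI installed
            *cmd,
        ]
        return subprocess.call(docker_cmd)

    if __name__ == "__main__":
        # e.g.: python sandbox.py ./myproject claude --dangerously-skip-permissions
        sys.exit(run_sandboxed(cmd=sys.argv[2:], project_dir=sys.argv[1]))

The trade-off is the one cube2222 mentions: anything outside the mounted directory (global module caches, credentials, other projects) is invisible to the agent, which is the point, but it also breaks tools that expect to find those paths.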
▲ | BrenBarn 3 hours ago
We also need laws. Releasing an AI product that can (and does) do this should be like selling a car that blows your finger off when you start it up.
▲ | jpc0 2 hours ago
This is more akin to selling a car to an adult who cannot drive, and they proceed to ram it through their garage door. It's perfectly within the capabilities of the car to do so. The burden of proof is much lower here, though, since the worst that can happen is that you lose some money or, in this case, hard drive contents. For the car, the seller would be investigated because there was a possible threat to life; for an AI, it's buyer beware.
▲ | pas 2 hours ago
There are laws about waiving liability for experimental products.

Sure, it would be amazing if everyone had to do a 100-hour course on how LLMs work before interacting with one.
▲ | chickensong 36 minutes ago
Google will fix the issue, just like auto makers fix their issues. Your comparison is ridiculous.
|
|
|
▲ | Vinnl an hour ago
Didn't sound to me like GP was blaming the user; just pointing out that "the system" is set up in such a way that this was bound to happen, and is bound to happen again.