avaer 5 hours ago

> for normie agents to take off in the way that you expect, you're going to have to grant them with full access

At this point it's a foregone conclusion this is what users will choose. It'll be like the (lack of) privacy on the internet caused by the ad-industrial complex, but much worse and much more invasive.

The threats are real, but it's just a product opportunity to these companies. OpenAI and friends will sell both the poison (insecure computing) and the antidote (Mythos et al.) and eat from both ends.

Anyone trying to stay safe will be on the gradient to a Stallmanesque monastic computing existence.

I don't want this, I just think it's going down that route.

intended 5 hours ago | parent | next [-]

There was a recent Stanford study showing that AI enthusiasts and experts have very different sentiments about AI than normies do.

I think most people are going to say they don't want it. I mean, why would anyone want a tool that can screw up their bank account? What benefit do they gain from it?

There are lots of examples of great, highly useful LLM tools, but the moment they scale up you get slammed by the risks that stick out all along the long tail of outcomes.

ryandrake 4 hours ago | parent [-]

I agree: in general, we're going to find that most employee end users ultimately don't want it, assuming it actually makes them more productive. I mean, who the hell wants to be 10X more productive without a commensurate 10X compensation increase? You're just giving away that value to your employer.

On the other hand, entrepreneurs and managers are going to want it for their employees (and force it on them) for the above reason.

retinaros 4 hours ago | parent | prev [-]

I don't see companies doing that. It could be business-ending. Only AI bros buying a Mac mini in 2026 to set up slop-generated Claws would do that; a company doing it would for sure expose customer data.