throwaway277432 3 hours ago

Unironically yes.

I predict that costs will grow to 80% of what it would cost a human, across the board for everything AI can do.

"It's still cheaper than a human" they'll say. Loudly here on HN too.

Of course this will happen slowly, very slowly. Let's meet again in 10-20 years.

revolvingthrow 3 hours ago | parent | next [-]

If OpenAI / Anthropic / Google were the only game in town, then yeah, we'd already be paying 5x as much as we do. But local models are so close to SOTA that it just isn't going to happen. If I'm a lawyer being billed $500k/yr against $600k profit, I'd rather buy a chonky server, run a model that's 90% as good, and make my money back in 2 years, paying $5k/yr in electricity out of that $600k profit.

Nobody will successfully lobby for banning local models either, it just isn’t going to happen when the rest of the world will happily avoid paying 80% of their profits to some US bigco for the privilege of existing.
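The break-even arithmetic implied above can be sketched out; every figure here is the commenter's hypothetical (including an assumed ~$1M server price backed out from the "money back in 2 years" claim), not a real quote:

```python
# Hypothetical figures from the comment above; none of these are real prices.
saas_cost_per_year = 500_000    # annual bill from a hosted frontier model
server_cost = 1_000_000         # assumed one-time cost of a "chonky" local server
electricity_per_year = 5_000    # commenter's estimated yearly running cost

# Years until the local server pays for itself versus the hosted bill
yearly_savings = saas_cost_per_year - electricity_per_year
break_even_years = server_cost / yearly_savings
print(round(break_even_years, 1))  # ~2.0 years, matching the comment's estimate
```

The point of the sketch is just that as long as local hardware amortizes in a couple of years, hosted providers can't raise prices to 80% of a human's cost without pushing customers off their platform.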

cactusplant7374 30 minutes ago | parent [-]

Could you really build something sophisticated with a local model? Say, a Linux kernel.

realusername 26 minutes ago | parent [-]

I'm using Codex with the Linux kernel and I discard maybe 80% of what it produces. This isn't an area that the top models have solved.

stronglikedan 32 minutes ago | parent | prev | next [-]

I don't think costs will grow on either side in the long term. In the short term, yes, but once they get the infrastructure in place to support AI, costs will go down. Right now, they're on borrowed infra.

GrinningFool an hour ago | parent | prev | next [-]

> I predict that costs will grow to 80% of what it would cost a human, across the board for everything AI can do.

80% of a human's price varies greatly by region. 80% of the lowest-priced human effort in this space right now will probably not be sustainable for the sellers.

KronisLV 3 hours ago | parent | prev | next [-]

> "It's still cheaper than a human" they'll say.

The question is how much friction there will be for people to switch over to Gemini, GPT or maybe even DeepSeek or Mistral or whatever. Even if price hikes are inevitable across the board, the moat any single org has is somewhat limited, so prices definitely will be a factor they'll compete on with one another at least a bit.

RussianCow 3 hours ago | parent [-]

> the moat any single org has is somewhat limited

I disagree. The models are going to become commodities (we're already almost there), but the tooling and integrations will be the moat. Reproducing everything Anthropic has already built with Claude Code, Cowork, and all their connectors would be nontrivial, and they're just getting started.

Anyone can implement an AI chatbot. But few will be able to provide AI that's deeply integrated into our daily lives.

KronisLV 2 hours ago | parent [-]

> Reproducing everything Anthropic has already built with Claude Code, Cowork, and all their connectors would be nontrivial, and they're just getting started.

They're one org with, presumably, some specific direction. As the actual models get better, expect a large part of the dev community to iterate on tools far more easily, sometimes producing ones that Anthropic doesn't quite have an equivalent to - for example, Cline just recently released their Kanban solution for dishing out tasks to agents (https://cline.bot/kanban), and OpenCode (https://opencode.ai/) has been around for a while for the agentic stuff and now has desktop and web versions as well, alongside dozens of others. Cline and KiloCode also have decent browser automation.

I will admit that everyone working on everything at the same time definitely means limitless reinvention of the wheel, and some genuinely good initiatives dying off along the way (I personally liked RooCode more than both Cline and KiloCode for Visual Studio Code, sad to see them go), but I doubt we're gonna see a lack of software. Maybe a lack of good software, though; it's not like Anthropic or any other org has a moat there either, since they're under the additional pressure of having to do a shitload of PR, release new models, and keep up appearances, compared to your average dev just pushing to GitHub (unless they want corporate money, in which case they do need some polish).

pingou 3 hours ago | parent | prev | next [-]

This is assuming there will be no competition. But why wouldn't there be? Especially since you can use open source models, which are not too far behind frontier models (for now).

vidarh 2 hours ago | parent | prev | next [-]

Kimi and GLM 5.1 are already capable of handling a good chunk of my tasks. They're about to lose the leverage that would allow them to drastically increase prices - enough models are 6-12 months away from being good enough for what large proportions of their customers use them for.

mystraline 3 hours ago | parent | prev [-]

It's not 20 years. It's now. Nvidia has already said that tokens cost more than humans.

https://finance.yahoo.com/sectors/technology/articles/cost-c...