WarmWash 3 hours ago

AI plans are not going to stay at $20/mo.

People will go to alternative models, but those will likely only be as popular as Linux.

pyrophane 3 hours ago | parent | next [-]

Yeah, this is something I am thinking a lot about. Companies won't be able to sustain this level of spending forever, and one of two things will need to happen:

1. Models become commodities and immensely cheaper to operate for inference as a result of some future innovation. This would presumably be very bad for the handful of companies who have invested that $1T and want to recoup that, but great for those of us who love cheap inference.

2. #1 doesn't happen and the model providers begin to feel empowered to pass the true cost of training + inference down to the model consumer. We start paying thousands of dollars per month for model usage, and the price gate blocks most people from reaping the benefits of bleeding-edge AI, locking them into cheaper models that are just there to extract cash by selling them things.

Personally I'm leaning toward #1. Future models nearly as good as the absolute best will get far cheaper to train, and new techniques and specialized inference chips will make them much cheaper to use. It isn't hard for me to imagine another DeepSeek moment in the not-so-distant future. Perhaps Anthropic is thinking the same thing, given the rumors that they are pushing toward an IPO as early as this year.

WarmWash 2 hours ago | parent [-]

Back-of-the-envelope calculations point to $60-$80/mo plans for a 5-10 year payback period.

This also fits with OpenAI's announced advertising cost, and is something most consumers can stomach.
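For what it's worth, here's roughly how that arithmetic could work out (a sketch only; the ~$1T figure comes from upthread, and the subscriber count is purely an illustrative assumption, not an announced number):

    # Rough payback arithmetic with assumed, not announced, inputs.
    capex = 1.0e12          # ~$1T industry investment (figure from upthread)
    subscribers = 200e6     # assumed number of paying subscribers
    for years in (5, 10):
        months = years * 12
        price = capex / (subscribers * months)
        print(f"{years}y payback -> ~${price:.0f}/mo per subscriber")
    # -> roughly $83/mo at 5 years, $42/mo at 10 years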

sambull 3 hours ago | parent | prev | next [-]

That's why they need to widen the moat; it appears not giving us access to hardware might be that moat.

They desperately need LLMs to stay a rentier business, and hardware advances are a direct attack on their model.

general1465 3 hours ago | parent | prev | next [-]

Economics will be the decisive force. Pay $1,000 a month for AI, or buy a $10k server and load a Chinese AI model that can do 90% of what SOTA models can? Looks like a no-brainer.

nebula8804 2 hours ago | parent [-]

Man, if China can catch up on the hardware front, we could see the 'TikTok' story repeat there (they provide a better product > US govt panics > bans the US from the good stuff).

scrollop 2 hours ago | parent | prev | next [-]

Yeah, they'll be free - on device and "good enough".

If you want the best, then pay.

wslh 3 hours ago | parent | prev | next [-]

Possibly, but that assumes continuity. New math and algorithmic breakthroughs could make much of today’s AI stack legacy, reshuffling both costs and winners.

co_king_3 3 hours ago | parent | prev [-]

I don't know about you, but I benefit so much from using Claude at work that I would gladly pay $1,500-$2,000 per month to keep using it.

galleywest200 3 hours ago | parent | next [-]

That is more than a month's rent for most of the world. Most people are simply not going to pay this.

wongarsu 2 hours ago | parent | next [-]

My rent is less than that. But if you add up salary, payroll taxes, benefits, social security, etc., my employer still spends around four times that amount on employing me, more if you include the misc overhead associated with having one more employee. Personally I could never afford 1,500-2,000€/month for dev tooling, but my employer should rationally be willing to spend that on anything that makes me more than 25% more effective.

I'm not sure today's Claude Code could ask for that. But I don't think it would be a crazy goal for them to work towards.
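The break-even arithmetic is simple enough to sanity-check (a sketch; the ~4x loaded-cost multiplier and the 2,000€/mo tool price are the figures above, the rest is illustrative):

    # Sketch: when does a 2,000 EUR/mo dev tool pay for itself?
    loaded_cost = 8000.0   # EUR/month the employer spends per developer (assumed ~4x)
    tool_cost = 2000.0     # EUR/month for the tooling
    break_even_gain = tool_cost / loaded_cost
    print(f"Tool pays off above a ~{break_even_gain:.0%} productivity gain")
    # -> ~25%, matching the figure above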

sarchertech 2 hours ago | parent [-]

There have been many, many productivity improvements over the last 50 years that provided more than a 25% boost. I've yet to see an employer pay that much per employee for any of them.

Also a 25% boost per individual doesn’t necessarily equal a 25% boost to the final output if there are other bottlenecks.

co_king_3 3 hours ago | parent | prev [-]

Well then I'm sorry but unfortunately they are going to be left behind.

People who are cut out to be software developers can afford the means of production.

ekjhgkejhgk 3 hours ago | parent | next [-]

The people who own "the means of production" aren't you.

flir 3 hours ago | parent | prev | next [-]

A $2k/month model, should it ever arise, won't need you.

Octoth0rpe 3 hours ago | parent [-]

I haven't looked at a cost analysis recently, but it's possible that today's models would already be $2k/month models if they were priced to be even slightly profitable.

mirsadm 3 hours ago | parent | prev | next [-]

Sure, they can also code without the help of a model, probably not that much slower.

throwaway77385 2 hours ago | parent | prev | next [-]

Things that can only be used by an exclusive elite don't tend to survive, unless we're talking super-yachts.

AI is only going to work if enough people can actually meaningfully use it.

Therefore, the monetisation model will have to adapt in ways that make it sustainable. OpenAI is experimenting with ads. Other companies will just subsidise the living daylights out of their solutions...and a few people will indeed run this stuff locally.

Look at how slow the adoption of VR has been. And how badly Meta's gamble on the metaverse went. It's still too expensive for most people. Yes, a small elite can afford the necessary equipment, but that's not a petri dish on which one can grow a paradigm-shift.

If only a few thousand people could afford [insert any invention here], that invention wouldn't be commonplace nowadays.

Now, the pyramid has sort of been turned on its head, in the sense that things nowadays don't start expensive and then become cheaper, but instead start cheap and then become...something else, be that more expensive or riddled with ads. But there are limits to this.

> People who are cut out to be software developers

You mean the people AI is going to replace? What's the definition of 'cut out to be' here?

Waterluvian 3 hours ago | parent | prev | next [-]

Your identity as a real software developer relies on the community's broad, inclusive definition of what it means to be one, something you're failing to extend to others.

To be sitting that far out on a limb of software development while sawing at the branches of others is quite an interesting choice.

mrbungie 3 hours ago | parent | prev | next [-]

Pretty edgy response. I'd say trying to scale in price rather than in quantity is a bad business strategy for tech, period, especially if you hope to become Google-sized like OpenAI and company want.

actionfromafar 3 hours ago | parent | prev | next [-]

Are you OpenAI? If not, you can't afford the means of production. You're the sharecropper.

vultour 2 hours ago | parent | prev | next [-]

This is such a hilariously out-of-touch SV techbro comment that I can't believe it's real. You're a monkey with a computer who knows how to Google; there's an endless number of people who can replace you.

DJBunnies 3 hours ago | parent | prev [-]

Big yikes bro.

nightski 3 hours ago | parent | prev | next [-]

At that cost I'd just buy some GPUs and run a local model though. Maybe a couple RTX 6000s.

organsnyder 3 hours ago | parent | next [-]

That's about as much as my Framework Desktop cost (thankful that I bought it before all the supply craziness we're seeing across the industry). In the relatively small amount of time I've spent tinkering with it, I've used a local LLM to do some real tasks. It's not as powerful as Claude, but given the immaturity in the local LLM space—on both the hardware and software side—I think it has real potential.

Cloud services have a head-start for quite a few reasons, but I really think we could see local LLMs coming into their own over the next 3-5 years.
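
For anyone curious what using a local model for "real tasks" looks like in practice: most local runners (ollama, llama.cpp's server, LM Studio) expose an OpenAI-compatible endpoint, so the client code is the same as for a cloud model. A minimal sketch, assuming an ollama install on its default port and a coding model already pulled (the model name here is just an example):

    # Sketch: query a local model via ollama's OpenAI-compatible endpoint.
    # Assumes `ollama serve` is running and the model has been pulled.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # ollama's default local endpoint
        api_key="ollama",                      # any non-empty string works locally
    )

    resp = client.chat.completions.create(
        model="qwen2.5-coder:14b",  # example model; substitute whatever you run
        messages=[{"role": "user",
                   "content": "Write a Python function that parses ISO 8601 dates."}],
    )
    print(resp.choices[0].message.content)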

gbnwl 3 hours ago | parent | prev | next [-]

Same, but I imagine once prices start rising, the prices of GPUs that can run any decent local models will soar (again) as well. You and I wouldn't be the only people with this idea, right?

general1465 2 hours ago | parent [-]

I mean, will they? I would expect that all those GPUs and servers will end up somewhere. Look at old Xeon servers: they all ended up in China. Nobody sane will put a 1U server at home, but the Chinese have recycled those servers by making X99 motherboards that take the RAM and Xeon CPUs from those noisy servers and turn them into PCs.

I would expect they could sell something like AI computers with a lot of GPU power, built from recycled GPU clusters like the ones in use today.

fishpham 3 hours ago | parent | prev [-]

Those won’t be sufficient to run SOTA/trillion parameter models

Zambyte 3 hours ago | parent | next [-]

And most tasks don't demand that.

general1465 2 hours ago | parent | prev [-]

Distilled models are good enough.

clownpenis_fart 3 hours ago | parent | prev [-]

I use my brain, it's free

co_king_3 3 hours ago | parent [-]

Fitting response for an account called "clownpenis_fart".

The future is here and it's time to stop ignoring it.

Your analog 1x productivity is worthless in comparison to my AI-backed 10x productivity.

sarchertech 2 hours ago | parent | next [-]

10x productivity means you should have had time to build your own programming language/OS/integrated dev environment or something equally impressive. Can you link to it?

throwaway77385 2 hours ago | parent | prev | next [-]

Yuck.

https://news.ycombinator.com/newsguidelines.html

> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

> Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

> When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

Honestly, if you just made your profile a day ago to yell overly confident and meaningless statements into the void, like a Mandrill in the jungle trying to shout over all the others, go back to LinkedIn, they like that kind of stuff there.

I even agree that AI has a place in our world and can greatly increase productivity. But we should talk about the how and why, instead of attacking others ad hominem and just stopping any discourse with absolutist nonsense.

Der_Einzige 2 hours ago | parent | prev [-]

People here are gonna be mad but they deserve to not reap what they don’t sow. Please keep triggering the snowflakes on this website with the truth.