hiddencost 7 hours ago

That presumes that performance improvements are necessary for commercialization.

From what I've seen the models are smart enough, what we're lacking is the understanding and frameworks necessary to use them well. We've barely scratched the surface on commercialization. I'd argue there are two things coming:

Era of Research -> Era of Engineering

Previous AI winters happened because we didn't have a commercially viable product, not because we weren't making progress.

ares623 7 hours ago | parent | next [-]

The labs can't just stop improvements though. They made promises. And the capacity to run the current models is subsidized by those promises. If the promise is broken, then the capacity goes with it.

selectodude 5 hours ago | parent | next [-]

> the capacity goes with it.

Sort of. The GPUs exist. Maybe LLM subs can’t pay for electricity plus $50,000 GPUs, but I bet after some people get wiped out, there’s a market there.

simianparrot 2 hours ago | parent [-]

Datacenter GPUs have a lifespan of 1-3 years depending on use. So yes, they exist, but not for long, unless they go entirely unused. But they also depreciate in efficiency compared to new hardware extremely fast, so their shelf life is severely limited either way.

nsomaru 2 hours ago | parent | next [-]

Personally I am waiting for the day I can realistically buy a second hand three year old datacentre GPU so I can run Kimi K2 in my shed. Given enough time, not a pipe dream. But 10 years at least.

tim333 2 hours ago | parent [-]

You'll probably be able to run Kimi K2 on the iphone 27.

soulofmischief 2 hours ago | parent | prev [-]

At this pace, it won't be many years before the industry is dependent on resource wars in order to sustain itself.

wmf 5 hours ago | parent | prev [-]

Maybe those promises can be better fulfilled with products based on current models.

AstroBen 7 hours ago | parent | prev | next [-]

We still don't have a commercially viable product though?

zaptrem 5 hours ago | parent | next [-]

I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me.

chroma205 4 hours ago | parent | next [-]

> I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me.

For OpenAI to produce a 10% return, every iPhone user on earth needs to pay $30/month to OpenAI.

That ain’t happening.
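A rough sanity check of the parent's figure (the ~1.5bn iPhone user count is my assumption, not from the thread; the $30/month is the parent's number):

```python
# Back-of-envelope: implied annual revenue if every iPhone user paid $30/month.
iphone_users = 1.5e9   # assumed active iPhone users worldwide (approximate)
monthly_fee = 30       # $/month, the parent comment's figure

annual_revenue = iphone_users * monthly_fee * 12
print(f"~${annual_revenue / 1e9:.0f}bn/year")  # ~$540bn/year
```

Whether that scale of revenue is what a 10% return actually requires depends on valuation assumptions the comment doesn't state.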

menaerus 3 hours ago | parent | next [-]

They don't sell their models to individuals only but also to companies, most likely with different business and pricing models, so that's an overly simplistic view of their business. Their spending increases YoY; we can safely assume that one of the reasons is a growing user base.

The time will probably come when we won't be allowed to consume frontier models without paying anything, as we can today, and when this $30 will most likely double or triple.

Though the truth is that R&D around AI models, and especially their hosting (inference), is expensive and won't get any cheaper without significant algorithmic improvements. If history is any guide, my opinion is that we may very well be ~10 years from that moment.

EDIT: HSBC has just published some projections. From https://archive.ph/9b8Ae#selection-4079.38-4079.42

> Total consumer AI revenue will be $129bn by 2030

> Enterprise AI will be generating $386bn in annual revenue by 2030

> OpenAI’s rental costs will be a cumulative $792bn between the current year and 2030, rising to $1.4tn by 2033

> OpenAI’s cumulative free cash flow to 2030 may be about $282bn

> Squaring the first total off against the second leaves a $207bn funding hole

So, yes, expensive (and that's minding the rental costs only) ... but foreseen to be penetrating into everything imaginable.

krige 2 hours ago | parent [-]

>> OpenAI’s cumulative free cash flow to 2030 may be about $282bn

According to whom, OpenAI? It is almost certain they flat-out lie about their numbers, as suggested by their 20% revenue share with MS.

menaerus 2 hours ago | parent [-]

A bank - HSBC. Read the article.

zaptrem 4 hours ago | parent | prev [-]

Not sure where that math is coming from. Assuming it's true, you're ignoring that some users (me) already pay 10X that. Btw, according to Meta's SEC filings: https://s21.q4cdn.com/399680738/files/doc_financials/2023/q4... they made around $22/month/American user (not even a heavy user or affluent iPhone owner) in Q3 2023. I assume Google would be higher due to larger market share.

lovich 3 hours ago | parent | prev [-]

If you fed thousands of dollars to them, but it cost them tens of thousands of dollars in compute, it’s not commercially viable.

None of these companies have proven the unit economics on their services

aurareturn 5 hours ago | parent | prev | next [-]

If all frontier LLM labs agreed to a truce and stopped training to save on cost, LLMs would be immensely profitable now.

AstroBen 5 hours ago | parent [-]

That isn't what I've seen: https://www.wheresyoured.at/oai_docs/

aurareturn 5 hours ago | parent [-]

https://simonwillison.net/2025/Aug/17/sam-altman/#:~:text=Su...

Also independent analysis: https://news.ycombinator.com/threads?id=aurareturn&next=4596...

amypetrik8 6 hours ago | parent | prev | next [-]

google what you just said and look at the top hit

it's an AI summary

google eats that ad revenue

it eats the whole thing

it blocked your click on the link... it drinks your milkshake

so, yes, there's a $100 billion commercially viable product

bakedoatmeal 5 hours ago | parent | next [-]

Google Search has 3 sources of revenue that I am aware of: ad revenue from the search results page, sponsored search results, and AdSense revenue on the websites the user is directed to.

If users just look at the AI overview at the top of the search page, Google is hobbling two of those sources (AdSense, sponsored search results), and also disincentivizing people from sharing the information on the web that makes the AI overview useful in the first place. In the process, they are significantly increasing the compute cost of each Google search.

This may be a necessary step to stay competitive with AI startups' search products, but I don't think this is a great selling point for AI commercialization.

skylissue 5 hours ago | parent | prev [-]

And so ends the social contract of the web, the virtuous cycle of search engines sending traffic to smaller sites which collect ad revenue which in turn boosts search engine usage.

To thunderous applause.



BenGosub an hour ago | parent | prev | next [-]

Besides building the tools for proper usage of the models, we also need smaller, domain-specific models that can run with fewer resources.

catigula 7 hours ago | parent | prev [-]

I don't think the models are smart at all. I can have a speculative debate with any model on any topic, and they commit egregious errors at an extremely high density.

They are, however, very good at things we’re very bad at.