onlyrealcuzzo 2 hours ago

> each model is roughly 2x profitable on its own, but each next model costs 10x the last. The whole thing only works if scaling keeps delivering.

This is a decent argument, but it's not the death knell you think.

Models are getting 99% more efficient every 3 years - combined with hardware and (mostly) software upgrades, you can get the same amount of output using 99% less power.

The number of applications where AI is already "good enough" keeps growing every day. If the cost goes down 99% every three years, it doesn't take long until you can make a ton of money on those applications.
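The compounding here is worth making explicit. A rough sketch, taking the commenter's 99%-per-3-years figure as given (it is their estimate, not an established fact):

```python
# Hypothetical illustration: if serving cost falls 99% every 3 years,
# the cost of a fixed workload relative to today compounds as 0.01**n,
# where n is the number of 3-year periods elapsed.
def relative_cost(years, drop_per_period=0.99, period_years=3):
    periods = years / period_years
    return (1 - drop_per_period) ** periods

print(relative_cost(3))  # 1% of today's cost
print(relative_cost(6))  # 0.01% of today's cost
```

Under that assumption, any application that is merely break-even today becomes very profitable within one or two periods.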

If AI stopped progressing today, it would probably take a decade or longer for us to take full advantage of it. So there is tons of forward-looking revenue that isn't counted yet.

For the foreseeable future, there are MANY MANY uses of models where a company would not want to host its own models and would be GLAD to pay a 4-5x cost for someone else to host the model and hardware for them.

I'm as bullish on OpenAI being "worth" $730B as I was on Snap being worth its IPO price - which it's still down about 80% from (after inflation, or ~95% adjusting for gold).

But guess what - these are MINIMUM valuations based on 50-80% margins - i.e. they're really getting ~$30B; the rest is the market value of hardware and hosting. OpenAI could be worth 80% less and still easily make a metric fuck-ton of money selling at IPO with a $1T+ market cap to speculative morons...

Realistically, very rich people with high risk tolerance are saying that they think OpenAI has a MINIMUM value of ~$100B. That seems very reasonable given the risk tolerance and wealth.

christoff12 an hour ago | parent | next [-]

> Models are getting 99% more efficient every 3 years - to get the same amount of output, combined with hardware and (mostly) software upgrades - you can use 99% less power.

Even if true, this still doesn't bend the cost curve for training the next model.

> If AI stopped progressing today, it would take probably a decade or longer for us to take full advantage of it. So there is tons of forward looking revenue that isn't counted yet.

If this is true, it's true for the technology overall, and not necessarily OpenAI since inference would get commoditized quickly at that point. OpenAI could continue to have a capital advantage as a public stock, but I don't think it would if the music stopped.

XenophileJKO 37 minutes ago | parent [-]

I would actually like to see the real math currently.

The market adoption has increased a lot. The cost to serve has come down a lot per token.

Model sizes have not increased exponentially recently (the high point being the aborted GPT-4.5); most refinement now seems to come from extending training on relatively smaller models.

Taken together, the training-to-inference income/cost ratio has likely changed dramatically.

blmarket an hour ago | parent | prev | next [-]

> 99% more efficient every 3 years

That would be 2x efficiency. I'd buy 50% less power, not a ridiculous 99% less.

sigmoid10 40 minutes ago | parent [-]

GPT-4 came out 3 years ago and you can run comparable models for 1% of the cost nowadays. That is not 2x efficiency. That's two orders of magnitude in end-to-end compute efficiency.
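The two framings can be reconciled arithmetically: a 100x improvement over 3 years is the same thing as a 2x improvement roughly every 5-6 months. A sketch of that conversion (an arithmetic identity, not a claim about actual measured costs):

```python
import math

# A 100x gain over 36 months implies log2(100) ≈ 6.64 doublings,
# so the equivalent doubling time is 36 / log2(100) months.
doubling_time_months = 36 * math.log(2) / math.log(100)
print(round(doubling_time_months, 1))  # ~5.4 months
```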

danielparsons 28 minutes ago | parent | next [-]

you're looking at nearly the entire curve of the tech's development. that's like saying lightbulbs became 99% more energy efficient and therefore will become another 99% more energy efficient. but most techs follow an S curve.

MengerSponge 12 minutes ago | parent [-]

But S curves are boring and don't moon

swingboy 31 minutes ago | parent | prev [-]

How do we know how much it costs? Or is this just based on the token pricing?

dfp33 an hour ago | parent | prev | next [-]

"If AI stopped progressing today, it would take probably a decade or longer for us to take full advantage of it."

AI stopped progressing, or LLMs? I really dislike people throwing the term AI around.

grosswait an hour ago | parent [-]

For the purposes of their argument, I don’t think the distinction matters.

robotpepi 35 minutes ago | parent | prev | next [-]

ok, but everything depends on your numbers being correct. 99% improved efficiency seems like a way too optimistic prediction.

kortilla an hour ago | parent | prev | next [-]

> Models are getting 99% more efficient every 3 years

The LLM industry has only been around for like 4 years. Extrapolating trends from that is pretty naive.

moron4hire an hour ago | parent | prev [-]

We said all the same shit about VR, dude. We even had a global pandemic show up to boost everyone's interest in the key market of telepresence. Turns out the merry-go-round can stop abruptly.

solumunus 40 minutes ago | parent [-]

Did we?! You and Mark Zuckerberg maybe.