bgirard 3 hours ago

It's a fun trope to repeat but that's not what OpenAI is doing. I get a ton of value from ChatGPT and Codex from my subscription. As long as the inference is not done at a loss, this analogy doesn't hold. They're not paying me to use it. They are generating output that is very valuable to me. Much more than my subscription cost.

I've been able to help set up cross-app automation for my partner's business, remodel my house, plan a trip to Japan and navigate the cultural barrier, vibe code apps, get technical support and so much more.

bloppe 3 hours ago | parent | next [-]

To be fair, I would get a ton of value out of someone selling dollars for 20 cents apiece.

But ya, OAI is clearly making a ton of revenue. That doesn't mean it's a good business, though. Given a 20-year horizon, shareholders will be very upset unless the firm can deliver about a trillion in profit, not revenue, to justify the $100B (so far) in investment, and even that would barely beat the long-term S&P 500 average return.
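
A rough back-of-the-envelope on that, as a quick Python sketch - the $100B is just the figure above, and ~10% is an assumed long-run S&P 500 nominal average, so treat both as illustrative:

    # compounding check - all numbers illustrative
    invested = 100e9          # ~$100B raised so far, per the figure above
    annual_return = 0.10      # assumed long-run S&P 500 nominal average
    years = 20
    benchmark = invested * (1 + annual_return) ** years
    print(f"${benchmark / 1e9:.0f}B")  # -> $673B

So beating the index means the initial $100B alone has to be worth roughly $670B in 20 years; fold in the further capital Altman says he'll need and you land in the "about a trillion" ballpark.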

But Altman himself has said he'll need much more investment in the coming years. And even if OAI became profitable by jacking up prices and flooding ChatGPT with ads, the underlying technology is so commodified that they'd never be able to achieve a high margin, assuming they can turn a profit at all.

littlestymaar 2 hours ago | parent [-]

I'd be a little bit more nuanced:

I think there's something off with their plans right now: it's pretty clear at this point that they can't own the technological frontier. Google is already too close, and from a purely technological PoV Google is much better positioned to have the best tech in the medium term (there's no moat, and Google has way more data and compute available, plus tons of cash to burn without depending on external funding).

But ChatGPT is an insane brand, and for most (free) customers I don't think model capabilities (aka "intelligence") are that important. So if they stopped training frontier models right now and focused on driving their costs down by optimizing their inference compute budget while serving ads, they could make a lot of money from their user base.

But that would probably mean losing most of their paying customers over the long run (companies won't keep buying mediocre tokens at a premium for long) and, more importantly, it would require abandoning the AGI bullshit narrative, which I'm not sure Altman is willing to do. (And even if he were, how to do that without collapsing from lack of liquidity, with investors feeling betrayed, is an open question.)

TheOtherHobbes 4 minutes ago | parent | next [-]

Altman's main interest is Altman. ChatGPT will be acquihired, most people will be let go, the brand will become a shadow of its former self, and Altman will emerge with a major payday and no obvious dent in his self-made reputation as a leading AGI thinkfluencer, etc.

I don't think ads are that easy, because the hard part of ads isn't taking money and serving up ad slop, it's providing convincing tracking and analytics.

As soon as ad slop appears a lot of customers will run - not all, but enough to make monetisation problematic.

bloppe 38 minutes ago | parent | prev | next [-]

The best way to drive inference cost down right now is to use TPUs. Either that or invest tons of additional money and manpower into silicon design like Google did, but they already have a 10 year lead there.

riffraff an hour ago | parent | prev [-]

> But ChatGPT is an insane brand

I mean, so was Netscape.

cmiles8 an hour ago | parent | next [-]

This. Netscape was THE browser in the early phases of the Internet. Then Microsoft just packaged IE into Windows and it was game over. The brand means nothing long term. If Google broadly incorporates Gemini into all the Google-owned things everyone already has then it’s game over for OpenAI.

The mass commoditization of the tech is rapidly driving AI to be a feature, not a product. And Google is very strongly positioned to take advantage of that. Microsoft too, and of course they have a relationship with OpenAI, but that's fraying.

littlestymaar an hour ago | parent | prev [-]

Maybe, I was too young to remember that.

felixfurtak 3 hours ago | parent | prev | next [-]

All of which you will be able to do with your bundled assistant in the not-too-distant future.

OpenAI is a basket case:

- Too expensive and inconvenient to compete with commoditized, bundled assistants (from Google/Microsoft/Apple)

- Too closed to compete with cheap, customizable open-source models

- Too dependent on partners

- Too late to establish its own platform lock-in

It echoes what happened to:

- Netscape (squeezed by Microsoft bundling + open protocols)

- BlackBerry (squeezed by Apple ecosystem + open Android OS)

- Dropbox (squeezed by iCloud, Google Drive, OneDrive + open tools like rclone)

When you live between giants and open-source, your margin collapses from both sides.

deathhand 39 minutes ago | parent [-]

So why does Salesforce still prosper? They are just a fancy database.

felixfurtak 11 minutes ago | parent | next [-]

Good question. Salesforce does well because they provide the application layer to the data.

The WWW in the 1990s was an explosion of data. To the casual observer, the web browser appeared to be the internet. But it wasn't, and in itself it could never make money (see Netscape). The internet was the data.

The people who built the infrastructure for the WWW (Worldcom, Nortel, Cisco, etc.) found the whole enterprise to be an extremely loss-making activity. Many of them failed.

Google succeeded because it provided an application layer of search that helped people to navigate the WWW and ultimately helped people make sense of it. It helped people to connect with businesses. Selling subtle advertising along the way is what made them successful.

Facebook did the same with social media. It allowed people to connect with other people and monetized that.

Over time, as they became more dominant, the advertising got less subtle and then the income really started to flow.

jasondigitized 9 minutes ago | parent | prev [-]

Because they locked in a ton of enterprise customers and have an army of certified consultants who build custom solutions for you.

tartoran 2 hours ago | parent | prev | next [-]

There's no doubt you're getting a lot of value from OpenAI; I am too. And yes, the subscription delivers a lot more value than what you pay for. That's because they're burning investors' money, and it's not sustainable. Once the money runs out they'll have to jack up prices, and that's the moment of truth: we'll see what users are willing to pay for what. Google or another company may be able to provide all of that much cheaper.

munk-a 3 hours ago | parent | prev | next [-]

As a developer - ChatGPT doesn't hold a candle to Claude for coding-related tasks and underperforms on arbitrary-format document parsing[1]. It still has value and can handle a lot of tasks that would amaze someone in 2020 - but it is simply falling behind while spending much more to do so.

1. It actually underperforms Claude, Gemini, and even some of the Grok models on accuracy for our use case of parsing PDFs and other rather arbitrarily formatted files.

jfb an hour ago | parent | prev | next [-]

That the product is useful does not mean the supplier of the product has a good business; and of course, vice versa. OpenAI has a terrible business at the moment, and the question is, do they have a plausible path to a good one?

rglullis 3 hours ago | parent | prev | next [-]

> They're not paying me to use it.

Of course they are.

> As long as the inference is not done at a loss.

If making money on inference alone were possible, there would be a dozen different smaller providers taking the open-weights models and offering that as a service. But it seems that every provider is anchored at $20/month, so you can bet that none of them can go any lower.

FeepingCreature an hour ago | parent | next [-]

> If making money on inference alone were possible, there would be a dozen different smaller providers taking the open-weights models and offering that as a service.

There are! Look through the provider list for any open model on https://openrouter.ai . For instance, DeepSeek 3.1 has a dozen providers. It would not make any sense for them to offer it below cost, because they have neither a moat nor a brand.

dragonwriter 2 hours ago | parent | prev [-]

> If making money on inference alone was possible

Maybe, but arguably a major reason you can't make money on inference right now is that the useful life of models is too short, so you can't amortize the development costs across much time. There is so much investment in the field that everyone is developing new models (shortening useful life in a competitive market) while simultaneously driving up the costs of the inputs needed to develop them (increasing the costs that have to be amortized over that short useful life). Perversely, the AI bubble popping and resolving those issues may make profitability much easier for the survivors with strong revenue streams.
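
A toy sketch of that amortization point, with every number made up purely for illustration:

    # toy model: amortized training cost per million tokens - all numbers hypothetical
    training_cost = 2e9             # hypothetical cost to develop a frontier model
    tokens_per_year = 1e15          # hypothetical tokens served per year
    inference_cost_per_mtok = 0.50  # hypothetical raw compute cost per million tokens

    for useful_life_years in (3, 1):
        amortized = training_cost / (useful_life_years * tokens_per_year / 1e6)
        print(useful_life_years, round(inference_cost_per_mtok + amortized, 2))
    # 3-year life: ~$1.17/Mtok all-in; 1-year life: ~$2.50/Mtok all-in

With these made-up numbers, cutting the useful life from three years to one roughly doubles the all-in cost per token even though the raw inference cost never changed - which is the squeeze being described.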

mirthflat83 3 hours ago | parent | prev | next [-]

Well, don't you think you're getting a ton of value because they're selling each of their dollars for 0.2 dollars?

steveBK123 3 hours ago | parent | prev | next [-]

If the subscription cost 5x as much would you still pay and feel you are getting such a great value?

dosinga an hour ago | parent [-]

If there are no free alternatives, yes. 100 USD a month for ChatGPT seems like great value.

ReptileMan 3 hours ago | parent | prev [-]

> As long as the inference is not done at a loss, this analogy doesn't hold.

I think there was an article here that claimed even inference is done at a loss, on a per-subscriber basis. I think it was for their $200 subscription.

In a way we will soon be in a "deal with it" situation where they just impose metered pricing instead of subscriptions.