spongebobstoes 4 days ago

Why do you see a "clear directionality" leading to ads? It's not obvious to me. ChatGPT is not social media; they don't have to monetize the same way.

They're making plenty of money from subscriptions, not to mention enterprise, business, and the API.

rrrrrrrrrrrryan 4 days ago | parent | next [-]

Altman has said numerous times that none of the subscriptions make money currently, and that they've been internally exploring ads in the form of product recommendations for a while now.

simianwords 4 days ago | parent [-]

Source? First time I’ve heard of it.

mtmail 4 days ago | parent [-]

"We haven't done any advertising product yet. I kind of...I mean, I'm not totally against it. I can point to areas where I like ads. I think ads on Instagram, kinda cool. I bought a bunch of stuff from them. But I am, like, I think it'd be very hard to…I mean, take a lot of care to get right."

https://mashable.com/article/openai-ceo-sam-altman-open-to-a...

simianwords 4 days ago | parent [-]

> Altman has said numerous times that none of the subscriptions make money currently

I was asking for a source for this part, specifically.

rrrrrrrrrrrryan 3 days ago | parent [-]

He's said even the pro plan is losing money:

https://x.com/sama/status/1876104315296968813

0xCMP 4 days ago | parent | prev | next [-]

One has a more obvious route to building a profile directly off the data they've already collected.

And while they're generating a lot of revenue, even they have admitted in recent interviews that ChatGPT on its own is still not (yet) breakeven. Given the kind of money invested in AI companies in general, introducing highly targeted ads is an obvious way to monetize the service further.

simianwords 4 days ago | parent [-]

This reflects an incorrect understanding of the unit economics. They're not breaking even only because of reinvestment into R&D.

0xCMP 3 days ago | parent [-]

Sam Altman said in an on-the-record dinner interview with Platformer[0] that, setting aside R&D, ChatGPT was breakeven, and Brad Lightcap, OpenAI's COO, corrected him by saying they were close, but not yet breakeven.

I assume Sam and Brad both understand the unit economics of their product.

The article is paywalled for me, but I heard it on their podcast[1], which somehow played fine even though that page is also showing me a paywall.

[0]: https://www.platformer.news/sam-altman-gpt-5-interview-light...

[1]: https://www.nytimes.com/2025/08/15/podcasts/hardfork-gpt5-pe...

biophysboy 4 days ago | parent | prev | next [-]

Presumably they would offer both models (ads & subscriptions) to reach as many users as possible, provided that both are net profitable. I could see free versions having limits on queries per day, Tinder-style.

Geezus_42 4 days ago | parent | prev | next [-]

None of the "AI" companies are currently profitable. Everything devolves into selling ads eventually. What makes you think LLMs are special?

ankit219 4 days ago | parent | prev | next [-]

The router introduced in GPT-5 is probably the biggest signal. While deciding which model to route a query to, a router can also estimate how much $$ that query is worth ("query" here meaning the conversation). That helps decide how much compute OpenAI should spend on it: high-value queries -> more opportunities for affiliate links and in-context ads.
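
To make that speculation concrete, here's a minimal sketch of what a "value-aware" router could look like. Every name, keyword list, and threshold below is hypothetical; nothing here is based on anything OpenAI has described, it just illustrates how routing and ad eligibility could be coupled:

    # Hypothetical sketch of a "value-aware" router (all names, keywords, and
    # thresholds are made up for illustration, not anything OpenAI has described).
    from dataclasses import dataclass

    COMMERCIAL_KEYWORDS = {"buy", "best", "price", "book", "flight", "hotel", "deal"}

    @dataclass
    class RoutingDecision:
        model: str              # which model tier to spend compute on
        estimated_value: float  # rough proxy for the conversation's $$ value
        ad_eligible: bool       # whether affiliate links / in-context ads make sense

    def route(conversation: str) -> RoutingDecision:
        words = conversation.lower().split()
        # Crude proxy for commercial intent: fraction of words with purchase intent.
        hits = sum(1 for w in words if w.strip(".,?!") in COMMERCIAL_KEYWORDS)
        value = hits / max(len(words), 1)
        if value > 0.05:
            # High-value query: spend more compute and allow affiliate links.
            return RoutingDecision("large-thinking-model", value, ad_eligible=True)
        return RoutingDecision("small-fast-model", value, ad_eligible=False)

    if __name__ == "__main__":
        print(route("what's the best price on a flight to Tokyo next month?"))
        print(route("explain how transformers use attention"))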

Then there's the way the memory profile is stored, which clearly mirrors personalization. Ads work best when they're personalized, as opposed to merely contextual or generic (Google ads are personalized based on both your profile and context). And then the change in branding from intelligent agent to companion app (and the hiring of Fidji Simo). There's more; this is just a high-level overview, and people have written detailed blog posts on it. I personally think affiliate links align the incentives for everyone. They're a kind of ad, and that's the direction they're marching toward.

tedsanders 4 days ago | parent [-]

I work at OpenAI and I'm happy to deny this hypothesis.

Our goal for the router (whether you think we achieved it or not) was purely to make the experience smoother and spare people from having to manually select thinking models for tasks that benefit from extra thinking. Without the router, lots of people just defaulted to 4o and never bothered using o3. With the router, people are getting to use the more powerful thinking models more often. The router isn't perfect by any means - we're always trying to improve things - but any paid user who doesn't like it can still manually select the model they want. Our goal was always a smoother experience, not ad injection or cost optimization.

ankit219 4 days ago | parent [-]

Hi! Thank you for the clarification. I was just saying it might be possible in the future (in a sense, you can already determine how much compute - i.e. which model - a specific query needs today). And the experience has definitely improved with the router, so kudos on that. I don't know what the final form factor of ads would be (I imagine it turning out to be a win-win-win scenario rather than showing ads at the expense of quality; this is a Google-level opportunity to invent something new), just that from the outside it looks like you're preparing for monetization via ads, given the large user base you have and virtually no competition at ChatGPT's level of usage.

dweinus 4 days ago | parent | prev [-]

> They're making plenty of money from subscriptions, not to mention enterprise, business, and the API.

...except that they aren't? They're not in the black, and all that investor money comes with strings attached.