famouswaffles 9 hours ago

>But OpenAI doesn't have tiny COGS: inference is expensive as fuck.

No, inference is really cheap today; people saying otherwise simply have no idea what they're talking about.

cherryteastain 2 hours ago | parent

Clearly not cheap enough.

> Even at $200 a month for ChatGPT Pro, the service is struggling to turn a profit, OpenAI CEO Sam Altman lamented on the platform formerly known as Twitter Sunday. "Insane thing: We are currently losing money on OpenAI Pro subscriptions!" he wrote in a post. The problem? Well according to @Sama, "people use it much more than we expected."

https://www.theregister.com/2025/01/06/altman_gpt_profits/

aurareturn an hour ago | parent

So just raise the price, or decrease the internal cost per token.

Altman also said 4 months ago:

  Most of what we're building out at this point is the inference [...] We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.
https://simonwillison.net/2025/Aug/17/sam-altman/