famouswaffles 9 hours ago
> But OpenAI doesn't have tiny COGS: inference is expensive as fuck.

No, inference is really cheap today, and people saying otherwise simply have no idea what they are talking about. Inference is not expensive.
cherryteastain 2 hours ago | parent
Clearly not cheap enough.

> Even at $200 a month for ChatGPT Pro, the service is struggling to turn a profit, OpenAI CEO Sam Altman lamented on the platform formerly known as Twitter Sunday. "Insane thing: We are currently losing money on OpenAI Pro subscriptions!" he wrote in a post. The problem? Well, according to @sama, "people use it much more than we expected."