PeterHolzwarth 13 hours ago

Yes, but that is the standard methodology for startups in their boost phase. Burn vast piles of cash to acquire users, then find out at the end if a profitable business can be made of it.

EA-3167 12 hours ago | parent | next [-]

It’s also the standard methodology for a number of scams.

mjamesaustin 8 hours ago | parent [-]

Scams are our entire economy now. Do whatever you can to own a market, then squeeze your customers miserably once you have their loyalty. Cash out, kick the smoking remains of the company to the curb, use your payout to buy into another company, and repeat.

lotsofpulp 3 hours ago | parent [-]

Kind of like paying more and more to Social Security and Medicare and getting less and less.

And like the backstop on asset prices at the expense of the currency's purchasing power.

Analemma_ 11 hours ago | parent | prev [-]

Most startups have big upfront capital costs and big customer acquisition costs, but small or zero marginal costs and COGS, and eventually the capital costs can slow down. That's why spending big and burning money to get a big customer base is the standard startup methodology. But OpenAI doesn't have tiny COGS: inference is expensive as fuck. And they can't stop capex spending on training because they'll be immediately lapped by the other frontier labs.

The reason people are so skeptical is that OpenAI is applying the standard startup justification for big spending to a business model where it doesn't seem to apply.
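
A rough back-of-envelope sketch of that difference, with every number made up purely to show the shape of the argument:

  # All numbers hypothetical; this only illustrates fixed vs. per-user costs.
  def monthly_profit(users, price, fixed_cost, cogs_per_user):
      revenue = users * price
      costs = fixed_cost + users * cogs_per_user
      return revenue - costs

  # Classic software startup: big fixed cost, near-zero per-user COGS.
  # Losses early, then margins improve as the user base grows.
  for users in (10_000, 100_000, 1_000_000):
      print(users, monthly_profit(users, price=20, fixed_cost=2_000_000, cogs_per_user=0.5))

  # Inference-heavy business: the same fixed cost plus a large per-user COGS.
  # Scale alone no longer fixes the margin.
  for users in (10_000, 100_000, 1_000_000):
      print(users, monthly_profit(users, price=20, fixed_cost=2_000_000, cogs_per_user=18))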

famouswaffles 10 hours ago | parent [-]

>But OpenAI doesn't have tiny COGS: inference is expensive as fuck.

No, inference is really cheap today, and people saying otherwise simply have no idea what they are talking about.

cherryteastain 4 hours ago | parent [-]

Clearly not cheap enough.

> Even at $200 a month for ChatGPT Pro, the service is struggling to turn a profit, OpenAI CEO Sam Altman lamented on the platform formerly known as Twitter Sunday. "Insane thing: We are currently losing money on OpenAI Pro subscriptions!" he wrote in a post. The problem? Well according to @Sama, "people use it much more than we expected."

https://www.theregister.com/2025/01/06/altman_gpt_profits/

aurareturn 3 hours ago | parent [-]

So just raise the price or decrease the cost per token internally.
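
A toy version of that arithmetic, with every number hypothetical rather than anything OpenAI has disclosed:

  # Hypothetical subscription break-even; none of these figures are real.
  def monthly_margin(price, tokens_per_month, cost_per_million_tokens):
      inference_cost = tokens_per_month / 1_000_000 * cost_per_million_tokens
      return price - inference_cost

  # A heavy Pro-style user at an assumed internal cost of $15 per million tokens.
  print(monthly_margin(price=200, tokens_per_month=20_000_000, cost_per_million_tokens=15))  # -100

  # The same usage after raising the price...
  print(monthly_margin(price=350, tokens_per_month=20_000_000, cost_per_million_tokens=15))  # 50
  # ...or after cutting the internal cost per token.
  print(monthly_margin(price=200, tokens_per_month=20_000_000, cost_per_million_tokens=7))   # 60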

Altman also said 4 months ago:

  Most of what we're building out at this point is the inference [...] We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.
https://simonwillison.net/2025/Aug/17/sam-altman/