hansmayer 4 days ago

Sure, feel free to break down the numbers.

93po 4 days ago

In 2024 they had a $5 billion loss. About $3b of that was training. $1.5b was employees. I'm sure there's at least another $0.5b of costs associated with building out rather than just serving inference; in reality it's probably several times that. So if you cut staff down to just maintaining what they have, fired all the researchers, stopped expansion, and stopped training, you'd be profitable. Which would be dumb and they wouldn't do it, but my point isn't that it's realistic; it's that they could sell what they have at a profit if they wanted to.
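
Rough back-of-envelope from those figures (assuming they're roughly right):

  $3b training + $1.5b employees + ~$0.5b build-out ≈ $5b of cuttable cost
  $5b loss − ~$5b of cut costs ≈ break-even

So the pure inference-serving business would be around break-even or better, depending on how much bigger than $0.5b the build-out spending really is.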

hansmayer 4 days ago

So they could be profitable, but the conditions to achieve that profitability are, in your own words, dumb and unrealistic. Somehow you claim to have still made your point, because a company that fires all its employees and stops all product development could be profitable? Is that what companies routinely do: maximise profits by firing everyone once the product is mature enough to practically take care of itself? I wonder why all the e-commerce companies don't just apply this one simple trick. Is that the argument you are making?

Now for the calculations: are you sure the losses are only $5B? If we account for the Microsoft-donated Azure credits that a lot of their workloads run on, it's probably a lot, lot more than that. Unaccounted for in the OpenAI books perhaps, but still a huge material investment that returns nothing to anyone, hence (by definition) a loss.

93po 3 days ago

I'm not sure what your original point was.

Either it's that serving AI as a business model is impossible to run at a profit, which I demonstrated is not the case: if it's just serving the model, then yes, it works, and there are plenty of businesses doing exactly that and operating at a profit.

Or it's that the expense of even running a GPU to serve a model isn't worth the value the model running on that GPU can produce, which is demonstrably not true, given that people are paying anywhere from dozens to hundreds of dollars a month, so there is an eventual payback period for both the hardware cost and the electricity.
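
Purely illustrative, made-up numbers (not real figures): say a serving box costs $250k, burns about $2k a month in electricity, and supports enough subscribers to bring in $15k a month:

  250,000 / (15,000 − 2,000) ≈ 19 months to pay back the hardware

After that point the box is running at a margin; the exact payback period obviously depends on the real hardware, power, and subscription numbers.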

hansmayer 3 days ago

I think it was on you to make a point here, not me. What is it that you demonstrated? I only saw a lot of creative imagination and "could be, would be" scenarios.