vessenes 4 days ago

You misunderstand.

"I built a ship to go to the Indies and bring back tea."

"Bro, the ship cost 100,000 pounds sterling and only brought back tea worth 50,000 pounds. I don't care if you paid just 12,500 pounds for the tea itself; you're losing money."

There is a very rational reason labs are spending everything they can raise on more compute right now. The tea (inference) pays 60%+ margins, and that number is rising. And that is AFTER the hyperscalers take their margins. There is an immense amount of profit floating around this system, and strategic players at the edge believe they can build and control the demand through combined spend on training and inference in the proper ratios.
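The ship analogy's arithmetic can be made concrete: a high gross margin on the cargo (inference) can coexist with an overall loss once the ship (training capex) is counted. A throwaway sketch using only the hypothetical figures from the analogy above, not any real lab's financials:

```python
# Hypothetical figures from the ship analogy, not real lab financials.
ship_capex = 100_000   # pounds sterling to build the ship (the training run)
tea_revenue = 50_000   # pounds sterling the cargo sells for (inference revenue)
tea_cost = 12_500      # pounds sterling paid for the tea (inference serving cost)

# Gross margin on the tea alone looks great...
gross_margin = (tea_revenue - tea_cost) / tea_revenue
print(f"gross margin on the tea: {gross_margin:.0%}")  # 75%

# ...but the venture still loses money once the ship is counted.
net = tea_revenue - tea_cost - ship_capex
print(f"net after the ship: {net:,}")  # -62,500
```

This is the crux of the disagreement: a quoted "60%+ inference margin" is a gross-margin claim, and says nothing by itself about profitability after training capex.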

SpicyLemonZest 4 days ago

60%+ margins according to numbers that have not been published and, AFAICT, have not been audited.

Could they be accurate? Sure, I think people who claim this is impossible are overconfident. But I would encourage anyone who assumes they must be right to read a history of the WorldCom scandal. It's really quite easy for a person who wants to be making money (or an LLM that's been instructed to "run the accounts, make no mistakes"!) to incorrectly categorize costs as capital investments when nobody's watching carefully.

vessenes 3 days ago

Any materially false public statement by one of the foundation-lab CEOs would be a huge foot fault. I'm not saying they would never lie, but it would be a very, very dumb thing to do: that public information can be relied on by their (very powerful) private investors. I think if you're hearing these numbers ballparked in public settings, they are, as a prior, directionally accurate.

SpicyLemonZest 3 days ago

I agree, although I would emphasize that WorldCom is a great example of a CEO doing exactly that very dumb thing. But I am not hearing these numbers ballparked in public settings. As far as I can tell, all the figures people cite for OpenAI or Anthropic margins come from anonymous leaks of internal documents.