| ▲ | aurareturn a day ago |
| This is an awful article. I don't know how it reached #1 on HN. Bottom line is that H100 prices are near 3-year highs, A100s are still profitable to run, B200 prices are increasing, and no one has enough compute. Google, OpenAI, Anthropic, Meta, AWS, and Azure are all compute-constrained. Every single one of them has said so publicly. Neoclouds are telling customers they're all sold out now, and you even have to book compute in advance if you're an AI company. Meanwhile the article's case is that OpenAI is struggling to monetize: they turned to showing ads in ChatGPT, something Sam Altman once called a “last resort”, while Anthropic is crushing them with the more profitable corporate customers and software engineers.
So the AI bubble is bursting because OpenAI is trying to monetize free users on ChatGPT with ads, while Anthropic is kicking butt in AI? What kind of logic is that? It seems like AI can be monetized, as Anthropic shows. Is the bubble going to burst because OpenAI can't monetize but Anthropic can?

> I wouldn’t be surprised at all if in the next couple of quarters we see OpenAI looking for an exit. It will be interesting because the sums involved are now so big that we will probably know all the details. The most likely buyer is Microsoft: they already own a lot of it, and because of that they are the most interested in showing a win.
I'll take the opposite stance. I think OpenAI is going to be bigger than Microsoft in market cap within the next 3 years. I think Anthropic and OpenAI are going to run laps around current big tech except maybe Google. For example, in a few years, I think AI agents could completely replace Microsoft Office, Microsoft's cash cow.

> Independent reports state that Claude metered models are priced 5x more expensive than their subscribers pay
Already dispelled. It isn't 5x more expensive than what subscribers pay. Inference has a gross margin of 50%+. That's been repeated over and over by the Anthropic CEO, the OpenAI CEO, and just about everyone who has done a deep analysis of token profitability. If you don't believe the OpenAI and Anthropic CEOs, just look at the inference providers on OpenRouter. They don't have VCs backing them to sell tokens at a loss; they have to make a margin on every token just to keep the lights on. |
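The margin claim is just arithmetic once you pick numbers; here's a toy sketch with purely hypothetical prices and costs (not actual figures from any provider):

```python
# Toy back-of-envelope for inference gross margin per token.
# All numbers are hypothetical, for illustration only -- not
# actual OpenAI / Anthropic / OpenRouter figures.

price_per_1m_output_tokens = 15.00  # assumed API price, USD
cost_per_1m_output_tokens = 6.00    # assumed serving cost (GPU time, power, hosting), USD

gross_profit = price_per_1m_output_tokens - cost_per_1m_output_tokens
gross_margin = gross_profit / price_per_1m_output_tokens

print(f"gross margin: {gross_margin:.0%}")  # 60% with these assumed numbers
```

The point is only that a provider without VC subsidies is unlikely to survive with that ratio below zero, whatever the actual numbers are.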
|
| ▲ | doom2 a day ago | parent | next [-] |
| > Bottom line is that H100 prices are near 3 year highs, A100s are still profitable to run, B200 prices are increasing, no one has enough compute.

Then why aren't the hardware manufacturers of components needed by AI companies making plans yesterday to bring new fabs online to meet demand? That isn't a gotcha question; I genuinely want to know. The money involved isn't that much compared to the money changing hands between Nvidia, Microsoft, OpenAI, etc., and it's not like once the in-progress data center construction is complete they won't need to buy more RAM and GPUs, especially with any new advances in technology that might happen. Inevitably someone will reply that hardware manufacturers don't want to be stuck losing money on a facility because the bubble popped and demand disappeared, but if Anthropic and OpenAI are going to "run laps around current big tech", it should be a no-brainer to increase production capacity. |
| |
| ▲ | jsnell a day ago | parent | next [-] | | A new fab will need to be filled with advanced equipment like lithography machines. They are the most complex thing humanity has ever built. There is one supplier of EUV lithography machines in the world, ASML. They are basically acting as an integrator for hundreds of highly specialized components manufactured to unimaginable levels of precision. Each of those components has roughly one eligible supplier in the world, and those suppliers are operating at full capacity. To expand, they'd need yet another set of specialized, almost-impossible-to-build equipment. So the supply chain moves incredibly slowly, and the slowness is intrinsic to the complexity and depth of the supply chain. It can't be fixed with just money. IIRC ASML is aiming to merely double their production of EUV lithography machines by 2030. | | |
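As a rough sanity check on what "merely double by 2030" implies for annual growth, here's a quick calculation assuming a roughly 5-year horizon (the horizon is my assumption, not ASML's):

```python
# Implied compound annual growth rate if EUV machine output doubles
# over an assumed 5-year window.

years = 5
annual_multiplier = 2 ** (1 / years)  # factor per year needed to double in 5 years
cagr = annual_multiplier - 1

print(f"implied annual growth: {cagr:.1%}")  # about 14.9% per year
```

So "double by 2030" is roughly 15% per year, which is modest next to the demand growth being claimed elsewhere in the thread.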
| ▲ | doom2 a day ago | parent [-] | | Sure, I didn't mean to suggest that it would be easy or fast to increase manufacturing capabilities, just that the confidence I'm seeing around AI should extend to the manufacturers (if that confidence in the future growth and success of OpenAI and Anthropic is warranted). That is, the business decision to increase RAM and GPU supply should be "easy". | | |
| ▲ | jsnell a day ago | parent [-] | | Right, but the business decisions probably aren't the constraint at this point? (But were a year ago.) Once the ability of the supply chain to grow has been saturated, no amount of extra confidence will make it grow faster. |
|
| |
| ▲ | aurareturn a day ago | parent | prev [-] | | They are. They're building as many fabs as they can, as fast as they can. The bottleneck is ASML, which can only make so many EUV machines, and no one else can make them. Scaling chip fabs and chip-making equipment is much harder and slower. And you have to understand that chip fabs go bankrupt if demand suddenly drops, so they have to be more cautious by default. | | |
| ▲ | zozbot234 a day ago | parent [-] | | If you're really compute constrained, do you actually need EUV machines? You can make do with DUV fabrication nodes, albeit at somewhat higher cost. The trailing edge is where a lot of the high-volume, high-impact innovation is, e.g. trying to replicate more advanced EUV nodes with DUV multiple patterning. | | |
| ▲ | aurareturn 13 hours ago | parent | next [-] | | That’s what’s happening. Companies that were planning a move to advanced nodes for non-AI chips are delaying it. All the advanced nodes are going to AI or smartphone chips only. | |
| ▲ | senordevnyc 15 hours ago | parent | prev [-] | | There was a good episode on Dwarkesh's podcast about this in the last few weeks, just a deep dive into the semiconductor industry and what the bottlenecks are. |
|
|
|
|
| ▲ | nunez a day ago | parent | prev | next [-] |
| > I think AI agents could completely replace Microsoft Office

How? What do you think lawyers/government will use to write briefs? |
| |
|
| ▲ | veunes 6 hours ago | parent | prev | next [-] |
| OpenAI overtaking Microsoft? Seriously? Microsoft has a massively diversified business spanning everything from gaming and cloud infra to B2B software that the entire world runs on. OpenAI has exactly one product (matrix weights), which is getting heavily commoditized by open-source models every single day. Once a hypothetical Llama 4 catches up to GPT-5, an API price war is going to completely nuke their hyper-margins. |
|
| ▲ | the_gipsy a day ago | parent | prev | next [-] |
| > but Anthropic is kicking butt in AI

That's not what the article said:

> They turned to showing ads in ChatGPT, something Sam Altman once called a “last resort”, while Anthropic is crushing them |
| |
| ▲ | aurareturn a day ago | parent [-] | | Yes, that's what he said. He said the AI bubble is going to burst because OpenAI needs to put ads on the free tier. Then he said Anthropic is doing great with enterprise customers. So which is it? Is the bubble going to burst because OpenAI needs to put ads on ChatGPT? Or is it not going to burst because Anthropic is doing great in enterprise? The logic has glaring flaws. |
|
|
| ▲ | HackerThemAll a day ago | parent | prev | next [-] |
| > I think OpenAI is going to be bigger than Microsoft in market cap within the next 3 years.

I have yet to see how a one-legged business model with just a single product (that is not crude oil), without a plan or money, is going to become sustainable. Oh yeah, maybe they'll finally make money on those autonomous lethal weapons. That sounds the easiest. |
| |
| ▲ | aurareturn a day ago | parent [-] | | Sure. I'll give you a basic plan without any insider knowledge of OpenAI.

First, OpenAI and Anthropic are the leaders in model capabilities. Google is a close 3rd, but 3rd nonetheless.

Second, ChatGPT likely has about 1 billion active users right now. I think ads on ChatGPT will surpass even Google search ads in the future. There will be a class of users who will never pay for ChatGPT subscriptions, and that's ok. Meta and Google are two of the most profitable companies in history, and both rely almost solely on free users for their cash cows. "Ask ChatGPT" is already "google it" for the masses.

Third, there is so much untapped revenue potential in the science and medicine fields that OpenAI can eventually own with Anthropic. Microsoft stands no chance here since they can't build competing models.

Fourth, I can easily see ChatGPT morphing into agents for consumers, and people will pay for them. AI is moving up the value chain fast. I don't see any reason why consumers who pay for Netflix won't pay for ChatGPT.

Just some basic ideas based on public knowledge. I'm sure there are plenty more. I'm not going to bet my house that OpenAI will become bigger than Microsoft in 3 years, but I'll put down a few hundred dollars on this bet. | |
| ▲ | niam a day ago | parent | next [-] | | I don't discount this as a possibility, but my impression is that the OpenAI brand isn't very sticky. Internet Explorer being pre-installed on Windows devices didn't prevent it from being demolished by the newcomer Chrome throughout the 2010s. Now we're looking at a product that's even less integrated, and whose value is exposed through universal interfaces (human language, images, etc.). If OpenAI succeeds, I imagine that remarkably little of it will have come from the brand. But subtracting the first-mover brand advantage: they can either compete on the frontier, which seems difficult and bears potentially diminishing returns (particularly wrt distillation); or compete as a commodity, which I imagine cannot justify their valuation/spend. It seems a very uphill battle. | | |
| ▲ | fragmede 21 hours ago | parent [-] | | For people that use ChatGPT the same way you do, yeah it's not. For people in the throes of AI psychosis who've named their ChatGPT and have a deep relationship with it, switching to a newer model from OpenAI is an issue, nevermind switching to a different model from a different company. | | |
| ▲ | niam 21 hours ago | parent | next [-] | | I considered that, but I don't see it being very impactful. It presumes a user who cares enough about "their" ChatGPT that they can't move from a particular model provider, but who somehow doesn't mind that model providers themselves have a financial motivation to shoo users onto their newer and more efficient models. The transition from GPT-4 to GPT-5 was not well received among this crowd -- never mind that I think this crowd is pretty small (comparatively) to begin with. I just don't imagine you can build a business on that sliver of a sliver, much less one that justifies OpenAI's spending. | |
| ▲ | d2ssa 19 hours ago | parent | prev [-] | | Most people don't give a hoot about that; they have much more interesting stuff going on in life. |
|
| |
| ▲ | d2ssa 19 hours ago | parent | prev [-] | | > Third, there is so much untapped revenue potential from science, medicine field that OpenAI can eventually own with Anthropic.

Lol... yeah. At this rate they don't even look like a going concern, let alone that. |
|
|
|
| ▲ | ratrace 18 hours ago | parent | prev [-] |
| [dead] |