jillesvangurp 3 hours ago

People talk about an AI bubble. What we actually have is a GPU bubble. NVidia makes really expensive GPUs for AI. Others also make GPUs.

Companies like Google train and run their AI models largely on their own TPUs rather than NVidia's GPUs. We've seen Chinese labs produce pretty competitive open models with either older NVidia GPUs or alternative chips because they aren't allowed to buy the newer ones. And AMD, Intel, and other chip makers are also eager to get in on the action. Companies like Microsoft and Amazon have their own chips as well (similar to Google). All the hyperscalers are moving away from NVidia.

And then Apple ships a non-Intel, non-NVidia range of workstations and laptops that is pretty popular with AI researchers, because the M-series CPU/GPU/NPU combination is pretty decent value for running AI models. You see similar movement with ARM chips from Qualcomm and others. They all want to run AI models on phones, tablets, and laptops. But without NVidia.
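
To make that concrete: researchers mostly write device-agnostic code, which is a big part of why NVidia is optional on a laptop. A minimal sketch, assuming PyTorch's standard "mps" backend on Apple Silicon (falling back to CPU elsewhere); the tiny Linear layer is just a stand-in for a real model:

    import torch

    # Pick whatever accelerator is present; no CUDA required.
    # On an M-series Mac this resolves to Apple's Metal backend ("mps").
    if torch.cuda.is_available():
        device = "cuda"
    elif torch.backends.mps.is_available():
        device = "mps"
    else:
        device = "cpu"

    model = torch.nn.Linear(4096, 4096).to(device)   # stand-in for a real model
    x = torch.randn(1, 4096, device=device)
    with torch.no_grad():
        y = model(x)
    print(device, y.shape)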

NVidia's bubble is about vastly overcharging for a thing that only they can provide. Their GPU chips have enormous margins relative to CPU chips coming out of the same or similar machines. That's a bubble. As soon as you introduce competition, the company with the best price/performance wins. NVidia is still pretty good at what they do. But not enough to justify an order-of-magnitude price/cost difference.

NVidia's success has been predicated on its proprietary software stack and instruction set (CUDA). That's a moat that won't last. The reason Google can use its own TPUs rather than NVidia GPUs is that it worked hard to get rid of its CUDA dependence. Same for the other hyperscalers. At this point they can do training and inference without CUDA/NVidia, and it's more cost-effective.
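
A rough illustration of why that moat is thin at the framework level: in something like JAX the model code never mentions CUDA at all, and XLA compiles it for whatever backend is installed (TPU, NVidia GPU, AMD GPU, or plain CPU). A minimal sketch, assuming a stock JAX install:

    import jax
    import jax.numpy as jnp

    # Lists whichever XLA backend is installed: TPU, GPU, or CPU devices.
    print(jax.devices())

    @jax.jit  # compiled by XLA for the local backend; no hand-written CUDA kernels
    def forward(w, x):
        return jnp.tanh(x @ w)

    w = jnp.ones((512, 512))
    x = jnp.ones((8, 512))
    print(forward(w, x).shape)  # (8, 512)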

The reason this 100B deal is apparently being reconsidered is that it's a bad deal for OpenAI. It was going to overpay for a solution it can get cheaper elsewhere. It's bad news for NVidia, good news for OpenAI. This deal started out with just NVidia, but at this point there are also deals with AMD, MS, and others. OpenAI, like the other hyperscalers, is not betting the company on NVidia/CUDA. Good for them.

catdog 2 hours ago

> People talk about an AI bubble. What we actually have is a GPU bubble. NVidia makes really expensive GPUs for AI. Others also make GPUs.

Yes it is, and I think for multiple reasons. Competition in that space not sleeping is one, but there is also a huge overestimation of demand, combined with the questionable belief that those GPUs and the data centers housing them can actually be built and put into operation as fast as envisioned.

> The reason this 100B deal is apparently being reconsidered is that it's a bad deal for OpenAI. It was going to overpay for a solution it can get cheaper elsewhere. It's bad news for NVidia, good news for OpenAI. This deal started out with just NVidia, but at this point there are also deals with AMD, MS, and others. OpenAI, like the other hyperscalers, is not betting the company on NVidia/CUDA. Good for them.

I think in OpenAI's case both may be true. While what you are saying makes sense, and NVidia's first-mover advantage obviously can't last forever, OpenAI currently has little to no competitive advantage over the other players. Combine this with the fact that some of them (esp. Google) sit on a huge pile of cash. For OpenAI, in contrast, the party is pretty much over as soon as investors stop throwing money into the oven, so they might need to cut back a bit.