| ▲ | mediaman 3 hours ago |
| How? They are all losing tens of billions of dollars on this, so far. Open source models are available at highly competitive prices for anyone to use and are closing the gap to 6-8 months behind frontier proprietary models. There doesn't appear to be any moat. This criticism seems very valid against advertising and social media, where strong network effects make dominant players ultra-wealthy and act like a tax, but the AI business looks terrible, and it appears that most benefits are going to accrue fairly broadly across the economy, not to a few tech titans. NVIDIA is the one exception to that, since there is a big moat on their business, but it's not clear how long that will last either. |
|
| ▲ | TheColorYellow 3 hours ago | parent | next [-] |
I'm not so sure that's correct. The Labs seem to offer the best overall products in addition to the best models, and requirements for models are only going to get more complex and stringent going forward. So yes, open source will be able to keep up from a pure performance standpoint, but you can imagine a future state where only licensed models can be used in commercial settings, and licensing requires compliance with limits on subversive use (e.g. no sexualization of minors, no bomb-making instructions, etc.). When the market shifts to a more compliance-relevant world, I think the Labs will have a monopoly on all of the research, ops, and production know-how required to deliver. That's not even considering whether Agents truly take off (which would place a premium on servicing those agents and agent environments rather than just the deployment). There are a lot of assumptions in the above, and the timelines certainly vary, so it's far from a sure thing - but the upside definitely seems there to me. |
| |
| ▲ | tru3_power 2 hours ago | parent | next [-] | | What's the purpose of licensing requiring those things, though, if someone could just use an open source model to do them anyway? If someone were going to do the things you mentioned, why do it through some commercial enterprise tool? I can see licensing maybe requiring a certain level of hardening to prevent prompt injections, but ultimately it still really comes down to how much power you give the model in whatever context it's operating in. | |
| ▲ | cj 2 hours ago | parent | prev [-] | | If that's the case, the winner will likely be cloud providers (AWS, GCP, Azure) who do compliance and enterprise very well. If Open Source can keep up from a pure performance standpoint, any one of these cloud providers should be able to provide it as a managed service and make money that way. Then OpenAI, Anthropic, etc end up becoming product companies. The winner is who has the most addictive AI product, not who has the most advanced model. |
|
|
| ▲ | gizmodo59 3 hours ago | parent | prev | next [-] |
Nvidia is not the only exception. Private big names are losing money, but there are so many public companies having the time of their lives: power, materials, DRAM, storage, to name a few. The demand is truly high. What we can argue about is whether AI is truly transforming everyone's lives, and the answer is no. There is a massive exaggeration of benefits. The value is not zero, and it's not 100. It's somewhere in between. |
| |
| ▲ | CrossVR 2 hours ago | parent [-] | | I believe that eventually the AI bubble will evolve into a simple scheme to corner the compute market. If no one can afford high-end hardware anymore, then the companies who hoarded all the DRAM and GPUs can simply go rent-seeking by selling the compute back to us at exorbitant prices. | | |
| ▲ | mikestorrent 2 hours ago | parent | next [-] | | The demand for memory is going to result in more factories and production. As long as demand is high, there's still money to be made in going wide to the consumer market with thinner margins. What I predict is that we won't advance in memory technology on the consumer side as quickly. For instance, a huge number of basic consumer use cases would be totally fine on DDR3 for the next decade. Older equipment can produce this; so it has value, and we may see platforms come out with newer designs on older fabs. Chiplets are a huge sign of growth in that direction - you end up with multiple components fabbed on different processes coming together inside one processor. That lets older equipment still have a long life and gives the final SoC assembler the ability to select from a wide range of components. https://www.openchipletatlas.org/ | |
| ▲ | digiown 2 hours ago | parent | prev [-] | | That makes no sense. If the bubble bursts, there will be a huge oversupply and prices will fall. Unless Micron, Samsung, Nvidia, AMD, etc. all go bankrupt overnight, prices won't go up when demand vanishes. | | |
| ▲ | charcircuit 2 hours ago | parent [-] | | There is a massive undersupply of compute right now for the current level of AI. The bubble bursting doesn't fix that. |
|
|
|
|
| ▲ | charcircuit 3 hours ago | parent | prev | next [-] |
>losing tens of billions

They are investing tens of billions. |
| |
| ▲ | bigstrat2003 2 hours ago | parent | next [-] | | They are wasting tens of billions on something that has no business value currently, and may well never, just because of FOMO. That's not what I would call an investment. | | |
| ▲ | charcircuit 7 minutes ago | parent [-] | | Many investments may lose money, but the EV here is positive due to the extreme utility that AI can bring and is bringing. |
| |
| ▲ | bandrami 2 hours ago | parent | prev [-] | | They are washing tens of billions of dollars in an industry-wide attempt to keep the music playing. |
|
|
| ▲ | gruez 3 hours ago | parent | prev | next [-] |
>Open source models are available at highly competitive prices for anyone to use and are closing the gap to 6-8 months from frontier proprietary models. What happens when the AI bubble is over and the developers of open models don't want to incinerate money anymore? Foundation models aren't like curl or openssl. You can't maintain them with a few engineers' free time. |
| |
| ▲ | compounding_it 2 hours ago | parent | next [-] | | Training is really cheap compared to the basically free inference being handed out by OpenAI, Anthropic, Google, etc. Spending a million dollars on training and giving the model away for free is far cheaper than spending hundreds of millions of dollars on inference every month while charging a few hundred thousand for it. | | |
| ▲ | mikestorrent 2 hours ago | parent [-] | | Not sure I totally follow. I'd love to better understand why companies are open sourcing models at all. |
| |
| ▲ | edoceo 2 hours ago | parent | prev [-] | | If the bubble is over, wouldn't all the built infrastructure become cheaper to train on, so those open models would incinerate less? Maybe there's an increase in specialist models? Like after dot-com, when the leftovers were cheap - for a time - and became valuable (again) later. | | |
| ▲ | bandrami 2 hours ago | parent [-] | | No, if the bubble ends the use of all that built infrastructure stops being subsidized by an industry-wide wampum system where money gets "invested" and "spent" by the same two parties. |
|
|
|
| ▲ | yowlingcat 3 hours ago | parent | prev | next [-] |
I agree with your point, and it's on that point that I disagree with GP. These open weight models, ultimately constructed from so many thousands of years of humanity's output, are also now freely available to all of humanity. To me that is the real marvel and a true gift. |
|
| ▲ | fHr 2 hours ago | parent | prev | next [-] |
| The other side of the market: |
|
| ▲ | ulfw 3 hours ago | parent | prev [-] |
It's turning out to be a commodity product. Commodity products are a race to the bottom on price. That's how this AI bubble will burst: the investments can't possibly show the ROIs envisioned. As an LLM user, I use whatever is free/cheapest. Why pay for ChatGPT if Copilot comes with my office subscription? It does the same thing. If not, I use Deepseek or Qwen and get very similar results. Yes, if you're a developer on Claude Code et al., I see the point. But that's few people. The mass market is just using chat LLMs, and those are nothing but a commodity. It's like jumping from Siri to Alexa to whatever the Google thing is called. There are differences, but they're too small to be meaningful for the average user. |