Lalabadie 5 hours ago

> $14B in AI terms is like 'second-best VSCode fork' tier money

Yep, and the comparison relies on key people believing the valuations.

Lots of mature companies will want their providers to be reasonably sheltered from the fallout of a coming US AI bubble burst.

SpicyLemonZest 4 hours ago | parent [-]

How would that work? What it means for the US AI bubble to burst is that tremendous amounts of inference capacity become open for pennies on the dollar. I don't see how Mistral is or could be sheltered from that.

torginus 3 hours ago | parent | next [-]

Btw, this makes a great argument for workers' rights: if a company owns datacenters, it can't fire its GPUs to make its Q2 look better.

Lalabadie 4 hours ago | parent | prev | next [-]

SLAs that are valuable to their clients, guarantees and mechanisms to protect them from data exfiltration, and generally long-term contracts with cash-stable orgs like they're currently doing.

So long as they're sufficiently liquid at the right time, they don't really need to shelter more. They need to plan for a fire sale on the bulk of their operating expenses.

SpicyLemonZest 3 hours ago | parent [-]

It's extremely hard to plan for a fire sale on the bulk of your operating expenses when all of your customers can see the fire sale happening and know they're now paying you way too much. That's the whole intuition of a general "US AI bubble"; if OpenAI filed for bankruptcy tomorrow, most people expect that would be a crisis for Anthropic and Gemini rather than a windfall opportunity to pick up their compute for cheap.

Lalabadie 3 hours ago | parent [-]

The crash would come from being unable to fulfill financial engagements when total real income + funding fails to keep up with spending, and does so persistently enough that the valuation mirage starts to fade.

What that reveals is the loaded cost of inference being more expensive than they've been showing, not cheaper. The crash would be the end of subsidized costs to users, not the revelation that it's a high-margin operation.

Selling compute/inference at an even bigger loss will probably not fly in the context of bankruptcy manoeuvres. They will need to shed spending engagements instead. I imagine Mistral would rather buy out some of their Nvidia purchase agreements at a discount if they want to build additional capacity at that time. I also don't think they'd be interested in US datacenters at all. If they want datacenters they can get them in Canada, with a better ally and fewer political + financial risks, which is kind of the Mistral segment already.
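The subsidy argument above can be put in back-of-envelope terms. A minimal sketch with invented numbers (nothing here is a reported price or cost from any provider):

```python
# Hypothetical per-million-token figures in cents; all numbers are invented
# to illustrate the argument, not reported costs or prices.
posted_price = 100   # what users are charged today (subsidized)
loaded_cost = 180    # true cost incl. training amortization + operating cost

subsidy = loaded_cost - posted_price
print(subsidy)  # 80: the per-Mtok gap currently absorbed by investors

# If the bubble bursts and subsidies end, the sustainable price floor is the
# loaded cost -- i.e. prices rise toward 180, they don't fall.
post_crash_floor = max(posted_price, loaded_cost)
print(post_crash_floor)  # 180
```

The point of the toy numbers is just the direction of the move: ending subsidies pushes prices up toward loaded cost, not down.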

tgv 3 hours ago | parent | prev | next [-]

Isn't running the models for end users the biggest cost at the moment?

tmikaeld 2 hours ago | parent [-]

Running the models is a tiny fraction of the cost. The cost is all in training the new models.

pyrale 3 hours ago | parent | prev [-]

> tremendous amounts of inference capacity become open for pennies on the dollar.

They can't be operated for pennies on the dollar, though. The likely current situation is that these products are subsidized: prices disregard model (training) cost, and part of the operating cost too.

If the bubble bursts, inference that can't be made profitable when factoring in operating costs will be scrapped, not sold for pennies.
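That break-even logic can be sketched in a few lines. The function and the figures are illustrative assumptions, not numbers from the thread:

```python
def inference_viable(revenue, opex, hw_amortization):
    """Capacity stays online only if per-token revenue covers operating
    expenses plus hardware amortization (all in the same units)."""
    return revenue >= opex + hw_amortization

# Pre-crash: hardware carried at full purchase price.
print(inference_viable(revenue=200, opex=150, hw_amortization=100))  # False

# Post-crash fire sale: hardware is nearly free, but power, cooling, and
# staff costs are unchanged. If opex alone exceeds post-crash revenue,
# the capacity gets scrapped rather than run "for pennies".
print(inference_viable(revenue=120, opex=150, hw_amortization=10))   # False

# Cheap hardware only helps if revenue already clears the opex floor.
print(inference_viable(revenue=200, opex=150, hw_amortization=10))   # True
```

The takeaway: a fire sale lowers the amortization term, but the opex floor is what decides whether the capacity runs at all.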

SpicyLemonZest 3 hours ago | parent [-]

I don't necessarily agree that's likely, but is it even the case that Mistral is more expensive than GPT or Claude? My understanding is that it's cheaper, which means it would fare worse in the scenario you're describing, unless they've perfectly calibrated the quality-cost tradeoff better than any American company.

pyrale 3 hours ago | parent [-]

All they have to do to survive is have enough cash flow to pay for their operating costs.

By providing specialized long-term services to corporate clients, they are securing exactly that.