phillipcarter 5 days ago

Uhhh, I'm pretty sure DeepSeek shook the industry because of a 14x reduction in training cost, not inference cost.

We also don't know the per-token cost for OpenAI and Anthropic models, but I would be highly surprised if it were significantly higher than for the open models anyone can download and run themselves. It's not as if they aren't investing in inference research too.

gmd63 5 days ago | parent | next [-]

DeepSeek was trained with distillation. Any accurate estimate of training costs should include the training costs of the model that it was distilling.
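For context, distillation in the usual soft-label setup means the student model is trained to match the teacher's output distribution, so the teacher's own training cost is baked into the student. A minimal sketch in PyTorch (the temperature and function name are illustrative, not DeepSeek's actual recipe):

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        # Soften both distributions with temperature T, then push the
        # student toward the teacher's distribution via KL divergence.
        student_log_probs = F.log_softmax(student_logits / T, dim=-1)
        teacher_probs = F.softmax(teacher_logits / T, dim=-1)
        return F.kl_div(student_log_probs, teacher_probs,
                        reduction="batchmean") * (T * T)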

ffsm8 5 days ago | parent [-]

That makes the calculation nonsensical, because if you go there... you'd also have to include all the energy used to produce the content the other model providers trained on. Suddenly that means everyone's devices on which they wrote their social media comments, pretty much every server that ever answered a request from OpenAI's/Google's/Anthropic's bots, and so on.

Seriously, that claim was always completely disingenuous.

gmd63 5 days ago | parent | next [-]

I don't think it's that nonsensical to realize that in order to have AI, you need generations of artists, journalists, scientists, and librarians to produce materials to learn from.

And when you're using an actual AI model to "train" (read: copy) a new one, it isn't nonsense at all to count the prior model as a core component of the training.

jaakl 4 days ago | parent | prev [-]

Not just the energy cost, but also the licensing cost of all this content…

andai 5 days ago | parent | prev | next [-]

Isn't training cost a function of inference cost? From what I gathered, they reduced both.

I remember seeing lots of videos at the time explaining the details, but basically it came down to the kind of hardware-aware programming that used to be very common. (Although they took it to the next level by using undocumented behavior to their advantage.)
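To make the dependence concrete: a training step runs the same forward pass that inference does, plus a backward pass on top, so any kernel-level speedup to the forward pass lowers both bills. A back-of-the-envelope sketch (the 2x backward factor is the common rule of thumb, not a DeepSeek-specific figure):

    def inference_cost(forward_flops_per_token, tokens):
        # Inference is just the forward pass.
        return forward_flops_per_token * tokens

    def training_cost(forward_flops_per_token, tokens, backward_factor=2.0):
        # Training = forward pass + backward pass (~2x forward FLOPs),
        # so a faster forward kernel shrinks both numbers at once.
        return forward_flops_per_token * tokens * (1.0 + backward_factor)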

booi 5 days ago | parent [-]

They're typically somewhat related, but the ratio between training and inference cost can vary greatly, so I guess the answer is no.

They did reduce both, though, mostly by using reduced precision.
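For anyone curious what "reduced precision" looks like in code, here's a minimal PyTorch mixed-precision sketch using bf16 as a stand-in; DeepSeek's reported FP8 training pipeline is considerably more involved:

    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    x = torch.randn(1024, 1024, device=device)
    w = torch.randn(1024, 1024, device=device)

    # Run the matmul in bfloat16 instead of float32, roughly halving
    # memory traffic and boosting throughput on modern accelerators.
    with torch.autocast(device_type=device, dtype=torch.bfloat16):
        y = x @ w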

baxtr 5 days ago | parent | prev [-]

Because of the alleged reduction in training costs.

basilgohar 5 days ago | parent [-]

All numbers reported by companies are alleged until verified by other, more trustworthy sources. I don't think DeepSeek's figures being "alleged" is especially notable; the numbers from the other companies are just as alleged.