piker 2 hours ago

The post is factoring in training costs, not just inference.

dwohnitmok 19 minutes ago | parent | next [-]

No, it's not. Otherwise this part wouldn't make sense:

> in fact, they actually compound the problem by encouraging significantly more usage

because if, once training costs are excluded, running the model is profitable, then significantly more usage helps the problem rather than compounding it.

More usage compounds the problem only if inference is unprofitable.
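The arithmetic behind that claim can be sketched as follows (all figures are hypothetical, purely to illustrate the amortization logic):

```python
# Hypothetical per-query economics; none of these numbers come from the thread.
def per_query_loss(training_cost, queries, revenue_per_query, inference_cost_per_query):
    """Net loss per query once the fixed training cost is spread over usage."""
    margin = revenue_per_query - inference_cost_per_query
    return training_cost / queries - margin

# Profitable inference (positive margin): more usage shrinks the per-query loss.
print(per_query_loss(1000, 100, 2.0, 1.0))    # fixed cost dominates at low usage
print(per_query_loss(1000, 1000, 2.0, 1.0))   # loss falls as usage grows

# Unprofitable inference (negative margin): no amount of usage closes the gap.
print(per_query_loss(1000, 1_000_000, 1.0, 2.0))  # stays positive
```

With a positive margin the fixed training cost amortizes away; with a negative margin each additional query adds to the loss, which is the sense in which more usage "compounds the problem."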

(the article briefly mentions training but that's later).

johnfn 2 hours ago | parent | prev [-]

But I don't need to pay training costs to use GLM-5?

piker 2 hours ago | parent [-]

Sure, but somebody needs to pay for GLM-6 unless you're happy to stop here.

InsideOutSanta 2 hours ago | parent [-]

If everybody stopped training models today and Anthropic and OpenAI were deleted from the universe, I'd be happy to keep using GLM-5 at its current inference cost. The article's author assumes there will come a point where we no longer have access to good models at a reasonable cost because current models are subsidized, but GLM-5 disproves that.