Danox 4 hours ago

But isn’t training models a forever task, like iterating in tech, where you can never take a day off? And bringing humans into the comparison: don’t humans train/teach themselves new skills over a lifetime? Isn’t one of the selling points, when selling this AI slop, that your AI never goes to sleep and can be trained forever? If so, the price of entry for AI will only increase as we go into the future.

atq2119 2 hours ago | parent | next [-]

I agree that training is a forever task, and the current rate of training is probably not sustainable. But all that means is that once the current investment mania ends, the market will most likely find a new equilibrium where continuous training still happens, but at a slower rate that can be sustained by inference revenue.

tonfa 26 minutes ago | parent [-]

> but at a slower rate that can be sustained by inference revenue.

Also, it's possible that the scale of inference needed (e.g. via the Jevons paradox) keeps growing to the point that training costs can be fully absorbed, since training cost is a one-off while inference revenue scales with usage.

(I suspect that might be the thinking; I don't know if it will be true. It's also possible that no model will create a moat big enough to attract enough inference traffic to make it work.)

Depending on the chips/architecture used, the idle off-peak capacity of inference hardware can also be used for training, subsidizing its cost.
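To make the absorption argument above concrete, here's a toy break-even sketch: how many inference queries would it take for the per-query margin to cover a one-off training run? All figures are hypothetical illustrations, not real provider economics.

```python
# Toy break-even model: queries needed so inference margin covers training cost.
# Every number below is a made-up illustration, not actual pricing data.

def breakeven_queries(training_cost: float,
                      price_per_query: float,
                      cost_per_query: float) -> float:
    """Queries needed for cumulative inference margin to equal training cost."""
    margin = price_per_query - cost_per_query
    if margin <= 0:
        raise ValueError("inference must be sold above its marginal cost")
    return training_cost / margin

# Hypothetical: a $100M training run, $0.01 revenue and $0.004 compute per query.
q = breakeven_queries(100e6, 0.01, 0.004)
print(f"{q:.3e} queries to break even")  # prints "1.667e+10 queries to break even"
```

The point of the toy model is just that the break-even volume scales linearly with training cost and inversely with per-query margin, so whether absorption works depends entirely on whether traffic (Jevons-style) grows faster than training budgets.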

asjir 3 hours ago | parent | prev [-]

Just keeping a model up to date with competitors is much cheaper, e.g. by copying better ones as Qwen did with Claude. Also, a bunch of research is trickling into open source / arXiv, so catching up should keep getting cheaper, at least as a fraction of the cost of training from scratch.