tefkah 9 hours ago

Theoretically it would be much less expensive to just keep running the existing models, but of course none of the current leaders are going to stop training new ones any time soon.

bombcar 7 hours ago | parent [-]

So are we on a hockey stick right now, where each new model is so much better than the previous one that you have to keep training?

Because almost every previous case of something like this eventually leveled out.