buttered_toast 3 hours ago

Can't say, just seems implausible, but I am a nobody anyways ¯\_(ツ)_/¯

verdverm an hour ago | parent [-]

I'm pretty sure it is widely known that the early 5.x series was built from the unreleased 4.5. It seems more plausible that the 5.x series is still in that continuation.

For some extra context, pre-training is roughly a third of the overall training, and it's where the model learns the basic concepts of how tokens go together. Mid- and late-training are where you instill the kinds of humanlike assistant behaviors we see today. I expect pre-training to become an increasingly small share of overall training, setting aside any shifts in what happens in each phase.
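
For concreteness, here's a minimal sketch of the pre-training objective itself (next-token prediction with cross-entropy). The names and shapes are illustrative assumptions, not anything from a real lab's stack:

    import torch
    import torch.nn.functional as F

    # Minimal sketch of the pre-training objective: next-token prediction.
    # `model` is any causal LM returning logits of shape (batch, seq, vocab);
    # the function name and setup are illustrative, not any lab's actual code.
    def pretrain_step(model, tokens, optimizer):
        logits = model(tokens[:, :-1])            # predict token t+1 from tokens <= t
        loss = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),  # flatten to (batch*seq, vocab)
            tokens[:, 1:].reshape(-1),            # shifted targets
        )
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Mid- and late-training swap in different data and objectives (SFT, RL) on top of the same base weights, which is the whole point of the continuation argument.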

So to me, it is plausible they can take the 4.x pre-training and keep pushing in the later phases. There are plenty of results out there showing that scaling laws have not hit their limits yet. I would not be surprised to learn that Gemini 3 Deep Research was 50% late-training / RL.

buttered_toast an hour ago | parent [-]

Okay I see what you mean, and yeah that sounds reasonable too. Do you have any context on that first part? I would like to know more about how/why they might not have been able to pursue more training runs.

verdverm 23 minutes ago | parent [-]

I have not done it myself (don't have the dinero), but my understanding is that there are many runs, restarts, and adjustments at this phase. As I understand it, it's surprisingly fragile, more so than outsiders realize.

If you already have a good one, it's not likely much has changed in the past year that would create meaningful differences at this phase (in the data, at least; the architecture is a different matter, and I know less there). If it is indeed true, it's a data point to add to the others signaling internal trouble (every lab has some amount of this; it's just not good when it makes the headlines).

Distillation is also a powerful training method. There are many ways to stay with the pack without new pre-training runs; it's pretty much what we see from all the labs with their minor versions. So coming back to it: the speculation is that OpenAI is still on their 4.x pre-train, but that doesn't impede all progress.
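
As a rough illustration of the distillation idea, here's a sketch of the standard soft-target setup (Hinton-style), where the student is trained to match the teacher's softened output distribution. The temperature and names are assumptions for the example, not anyone's actual recipe:

    import torch
    import torch.nn.functional as F

    # Sketch of knowledge distillation: the student mimics the teacher's
    # softened token distribution. Temperature T and the names here are
    # illustrative assumptions, not any lab's real configuration.
    def distill_loss(student_logits, teacher_logits, T=2.0):
        student_logp = F.log_softmax(student_logits / T, dim=-1)
        teacher_p = F.softmax(teacher_logits / T, dim=-1)
        # kl_div expects log-probs for the input and probs for the target;
        # T^2 rescales gradients back to the hard-label scale (standard trick).
        return F.kl_div(student_logp, teacher_p, reduction="batchmean") * (T * T)

This is why a minor version can close much of the gap to a bigger model without anyone paying for a fresh pre-training run.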