throwaway314155 2 days ago
> which presumably hasn't done a fresh pre-training over the web

What makes you think that?

> Did they figure out how to do more incremental knowledge updates somehow?

It's simple. You take the existing model and continue pretraining with newly collected data.
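For illustration, a minimal sketch of what "continue pretraining with newly collected data" could look like with Hugging Face Transformers. The base model ("gpt2"), the dataset (wikitext as a stand-in for freshly collected text), and the hyperparameters are placeholder assumptions, not a description of any lab's actual pipeline.

```python
# Continued pretraining sketch: load an existing causal LM checkpoint and
# keep training it on a new text corpus with the standard LM objective.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # placeholder for "the existing model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder for newly collected data; any recent text corpus works here.
raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
raw = raw.filter(lambda x: len(x["text"].strip()) > 0)  # drop empty lines

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw.map(tokenize, batched=True, remove_columns=raw.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="continued-pretrain",
        num_train_epochs=1,
        per_device_train_batch_size=4,
        learning_rate=1e-5,  # lower than initial pretraining to limit forgetting
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The main practical concerns are the ones the thread alludes to: it still costs real compute, and naive continued pretraining can degrade older knowledge (catastrophic forgetting), which is why a reduced learning rate or a mix of old and new data is commonly used.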
Workaccount2 2 days ago
A leak reported on by SemiAnalysis stated that they haven't pre-trained a new model since 4o due to compute constraints.