elgatolopez 2 days ago

Where did you get that from? The cutoff date says August 2025. It looks like a newly pretrained model.
SparkyMcUnicorn 2 days ago | parent | next

If the pretraining rumors are true, they're probably using continued pretraining on the older weights. Right?
FergusArgyll 2 days ago | parent | prev

> This stands in sharp contrast to rivals: OpenAI's leading researchers have not completed a successful full-scale pre-training run that was broadly deployed for a new frontier model since GPT-4o in May 2024, highlighting the significant technical hurdle that Google's TPU fleet has managed to overcome.

https://newsletter.semianalysis.com/p/tpuv7-google-takes-a-s...

It's also plainly obvious from using it. The "broadly deployed" qualifier is presumably referring to 4.5.