FergusArgyll | 2 days ago
> This stands in sharp contrast to rivals: OpenAI’s leading researchers have not completed a successful full-scale pre-training run that was broadly deployed for a new frontier model since GPT-4o in May 2024, highlighting the significant technical hurdle that Google’s TPU fleet has managed to overcome.

https://newsletter.semianalysis.com/p/tpuv7-google-takes-a-s...

It's also plainly obvious from using it. The "broadly deployed" qualifier is presumably referring to GPT-4.5.
ric2b | 5 hours ago | parent
How is that a technical hurdle if they obviously were able to do it before? It's probably just a matter of cost/benefit analysis: a full-scale pre-training run is very expensive, so the benefits need to be significant to justify it.