grim_io | 4 days ago
Google shows that Nvidia is not necessary. How long until more follow?
NitpickLawyer | 4 days ago
Tbf, Google started a long time ago with their TPUs, and they've had some bumps along the way. There are certainly other efforts to produce alternatives, but it's not as easy as one might think. Even the ASIC-like providers like Cerebras and Groq are having problems with large models. They seemed very promising with SLMs, but once MoEs became a thing they started to struggle.
ivape | 4 days ago
I don't think we can say that until we hear how Genie 3 and Veo 3 were trained. My hunch is that the next-gen multi-modal models that combine world, video, text, and image models can only be trained on the best chips.
arthurcolle | 4 days ago
I agree in principle, but you can't just yolo-fab TPUs and leapfrog Google.