davidguetta 4 hours ago

Implementation and Sustainability Hardware: Gemini 3 Pro was trained using Google’s Tensor Processing Units (TPUs). TPUs are specifically designed to handle the massive computations involved in training LLMs and can speed up training considerably compared to CPUs. TPUs often come with large amounts of high-bandwidth memory, allowing for the handling of large models and batch sizes during training, which can lead to better model quality. TPU Pods (large clusters of TPUs) also provide a scalable solution for handling the growing complexity of large foundation models. Training can be distributed across multiple TPU devices for faster and more efficient processing.
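For what the "distributed across multiple TPU devices" part means in practice: a common pattern is data parallelism, where each device computes gradients on its own shard of the batch and the gradients are averaged with a collective op. A minimal sketch in JAX (the toy model, learning rate, and data here are made up for illustration; this is the general data-parallel pattern, not Google's actual training setup):

```python
import functools
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Simple least-squares loss for a toy linear model.
    return jnp.mean((x @ w - y) ** 2)

# pmap replicates the step across all local devices (TPU cores on a TPU
# host; a single CPU device when run locally).
@functools.partial(jax.pmap, axis_name="devices")
def train_step(w, x, y):
    g = jax.grad(loss)(w, x, y)
    # Average gradients across devices -- the cross-device collective.
    g = jax.lax.pmean(g, axis_name="devices")
    return w - 0.1 * g

n = jax.local_device_count()
key = jax.random.PRNGKey(0)
# One shard of 8 examples per device; leading axis = device count.
x = jax.random.normal(key, (n, 8, 4))
y = jnp.sum(x, axis=-1)                 # toy target: sum of features
w = jnp.broadcast_to(jnp.zeros((4,)), (n, 4))  # replicated weights

for _ in range(500):
    w = train_step(w, x, y)
# All replicas hold the same weights, close to the true solution [1,1,1,1].
```

On a real TPU Pod slice the same code runs unchanged with `n` equal to the number of cores; the collective is what keeps the replicas in sync.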

So Google doesn't use NVIDIA GPUs at all?

dekhn 3 hours ago | parent | next [-]

When I worked there, there was a mix of training on nvidia GPUs (especially for sparse problems when TPUs weren't as capable), CPUs, and TPUs. I've been gone for a few years but I've heard a few anecdotal statements that some of their researchers have to use nvidia GPUs because the TPUs are busy.

rjh29 an hour ago | parent | prev | next [-]

I assume that's a Gemini LLM response? You can tell Gemini is bullshitting when it starts using "often" or "usually" - like in this case "TPUs often come with large amounts of memory". Either they did or they didn't. "This (particular) mall often has a Starbucks" was one I encountered recently.

PunchTornado 4 hours ago | parent | prev | next [-]

no. only tpus

paride5745 3 hours ago | parent | prev | next [-]

Another reason to use Gemini then.

Less impact on gamers…

TiredOfLife 3 hours ago | parent [-]

TPUs still use RAM and chip production capacity

lejalv 3 hours ago | parent | prev [-]

Bla bla bla yada sustainability yada often come with large better growing faster...

It's such an uninformative piece of marketing crap