studmuffin650 a day ago

Also important to remember that Google is years ahead of most other AI shops in that they're running on custom silicon. This makes their inference (and maybe training) cheaper than almost any other company's. People don't realize this when comparing them to OpenAI/Anthropic: while most shops are running on NVIDIA GPUs, Google is completely different in that respect with their custom TPU platform.

xnx a day ago | parent | next [-]

> Also important to remember that Google is years ahead of most other AI shops in that they're running on custom silicon.

Not just the chips, Google's entire datacenter setup seems much more mature (e.g. liquid cooling, networking, etc.). I saw some video of new Amazon datacenter (https://www.youtube.com/watch?v=vnGC4YS36gU) and it looks like a bunch of server racks in a warehouse.

stego-tech a day ago | parent [-]

Google’s datacenters are excellent, from what I’ve seen in my career. They genuinely had so many amazingly talented SMEs pushing boundaries for decades without executive intervention or deterrence, and that’s paid dividends during the subsequent tenure under Pichai and external shareholders (in that they have “infinite” runway and cash reserves to squander on moonshots before risking the company’s core businesses). That said, nothing lasts forever, and if their foray into LLMs doesn’t pay off, their shareholders are going to be pissed.

lokar a day ago | parent [-]

And not just pushing the boundaries, working with the HW vendors to define them, asking for features and design elements that others don't really even see the point of.

cma a day ago | parent | prev [-]

Anthropic uses TPUs as well as Nvidia GPUs. Compiler bugs in the tooling around the TPU platform caused most of their quality issues and customer churn this year, but I think they've since announced a big expansion in TPU use:

https://www.anthropic.com/engineering/a-postmortem-of-three-...