isodev 4 days ago

> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

I know watts, but I really can't quantify this. How much Nvidia hardware is in a fleet of servers that consumes 10 GW? Do they all use the same chip? What if a newer chip consumes less power, does the deal then imply more servers? Did GPT write this post?

mr_toad 4 days ago

You don’t need AI to write vague, waffly press releases. But to put this in perspective, an H100 has a TDP of 700 watts; the newer B100s are 1,000 watts, I think?
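A quick back-of-the-envelope in Python, assuming those per-GPU TDPs (700 W H100-class, 1,000 W B100-class) and ignoring cooling, networking, and other datacenter overhead, none of which is specified in the press release:

```python
# Rough upper bound on GPU count for a 10 GW buildout, counting GPU TDP only.
# Assumptions: 700 W (H100-class) and 1,000 W (B100-class) per GPU;
# PUE and non-GPU power draw are ignored, so real counts would be lower.
TOTAL_POWER_W = 10e9  # 10 GW

for name, tdp_w in [("H100-class (700 W)", 700), ("B100-class (1,000 W)", 1000)]:
    gpus = TOTAL_POWER_W / tdp_w
    print(f"{name}: ~{gpus / 1e6:.1f} million GPUs")
# -> roughly 14.3 million and 10.0 million GPUs respectively
```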

Also, the idea of a newer Nvidia card using less power is très amusant.

nick__m 4 days ago

A 72-GPU NVL72 rack consumes up to ~130 kW, so 10 GW works out to a little more than 5,500,000 GPUs.
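A minimal sketch of that rack-level arithmetic, taking the comment's figures (72 GPUs and ~130 kW per NVL72 rack) at face value:

```python
# Rack-based estimate: how many NVL72 GPUs fit in a 10 GW power budget.
TOTAL_POWER_W = 10e9    # 10 GW
RACK_POWER_W = 130e3    # ~130 kW per NVL72 rack (figure from the comment)
GPUS_PER_RACK = 72

racks = TOTAL_POWER_W / RACK_POWER_W   # ~76,900 racks
gpus = racks * GPUS_PER_RACK           # ~5.5 million GPUs
print(f"~{racks:,.0f} racks, ~{gpus / 1e6:.1f} million GPUs")
```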

az226 3 days ago

$150-200B worth of hardware. About 2 million GPUs.
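Sanity-checking the implied per-GPU price from those two figures ($150-200B of hardware and ~2 million GPUs are this comment's estimates, not official numbers):

```python
# Implied all-in system cost per GPU from the comment's estimates.
HARDWARE_COST_USD = (150e9, 200e9)  # $150-200B (estimate from the comment)
GPU_COUNT = 2e6                     # ~2 million GPUs (estimate from the comment)

low, high = (cost / GPU_COUNT for cost in HARDWARE_COST_USD)
print(f"Implied cost per GPU: ${low:,.0f} - ${high:,.0f}")
# ~$75,000 - $100,000 per GPU, i.e. full-system (rack, networking, memory)
# pricing rather than bare-chip pricing.
```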

So this investment is somewhat structured like the Microsoft investment where equity was traded for Azure compute.