ben_w 4 days ago

> At that point, companies might rediscover the ROI of good old meat based AI.

I doubt this will look good for any party.

The global electricity supply is about 375 W/capita, and there's plenty of direct evidence, in the form of companies building new power plants, that they are electricity-limited. I've followed renewable energy trends for a long time, and even if their rapid exponential growth continues, renewables can only roughly double that figure by 2032.
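The "roughly double by 2032" claim can be sanity-checked with a quick sketch. Both inputs below are my assumptions, not figures from the comment: renewables (hydro + wind + solar) at roughly 30% of current generation, and about 7 years of growth remaining.

```python
# Sanity check on "renewables can only roughly double supply by 2032".
# Assumed inputs: renewables at ~30% of today's generation, 7 years out.
renewable_share = 0.30
years = 7

# Doubling total supply while non-renewables stay flat at 0.70x of today's
# total means renewables must grow from 0.30x to 1.30x of today's total.
required_multiple = (2.0 - (1.0 - renewable_share)) / renewable_share
required_cagr = required_multiple ** (1 / years) - 1
print(f"{required_multiple:.2f}x, {required_cagr:.1%}/yr")  # → 4.33x, 23.3%/yr
```

A required ~23%/yr is right at the edge of solar's historical growth rate (roughly 25%/yr), which is consistent with "roughly double" being the optimistic ceiling.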

If we simplify the discussion by calling LLM output "about as good as a junior graduate's", then the electricity bill can rise until the cost of supplying that inference matches the cost of hiring a junior graduate. The equilibrium is ugly from either direction. If electricity stays at today's price, graduate-level wages get bid down to the cost of inference, and graduates can't earn enough to feed themselves. If instead graduates earn the bare minimum needed to feed and house themselves in G7 nations, electricity gets bid up until normal people are priced out of heating/AC and the street lights get turned off because municipalities can't cover the bill. And if inference electricity becomes as expensive as hiring Silicon Valley software engineering graduates, normal people won't even be able to keep their phones charged.
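The break-even electricity price in that equilibrium is easy to sketch. Every number here is an illustrative assumption (a fully loaded junior salary of $60k/yr, and 20,000 kWh/yr of inference to match one graduate's output), not a measurement:

```python
# Sketch of the equilibrium: electricity prices can rise until running
# "graduate-equivalent" inference costs as much as the graduate does.
# Both inputs are illustrative assumptions.
graduate_cost_per_year = 60_000   # USD, fully loaded junior salary (assumed)
inference_kwh_per_year = 20_000   # kWh to match one graduate's output (assumed)

# Price at which the two cost curves cross:
break_even_price = graduate_cost_per_year / inference_kwh_per_year
print(f"${break_even_price:.2f}/kWh")  # → $3.00/kWh
```

Under those assumptions the crossing point is around $3/kWh, roughly 20x current US retail rates; at that price a household using 900 kWh/month would owe $2,700, which is the "priced out of heating/AC" scenario in concrete terms.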

That said:

> A human brain runs on under 20 W

Only if you ignore the body it's attached to, which we cannot currently live without. We also need a lot of time off: we start working around 21 and stop just under 70 (about 5/8ths of our lives), the working week is 40 hours out of 168, and beyond that we need time away from paid work for sickness and reproduction, and many of us also like holidays.

Between all those capacity factors, for every hour (at 20 W, so 20 Wh) that the average American worker's brain spends on the job, there's a corresponding average of about 1 kWh consumed by the bodies of various Americans.
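The ~1 kWh figure can be reconstructed from the factors above, plus two inputs I'm assuming rather than taking from the comment: a ~100 W average metabolic rate (~2000 kcal/day of food energy) and ~80% labor-force participation during working years.

```python
# Reconstructing "~1 kWh of body per work-hour of brain" from the
# comment's capacity factors plus two assumed inputs.
body_w = 100            # ~2000 kcal/day of food energy (assumed)
career = 5 / 8          # working from 21 to just under 70, out of a ~78-year life
week = 40 / 168         # 40-hour working week
availability = 0.85     # sickness, reproduction, holidays (assumed)
participation = 0.80    # labor-force participation in working years (assumed)

# Fraction of all person-hours that are brain-on-the-job hours; every
# work-hour therefore corresponds to 1/work_fraction person-hours of body.
work_fraction = career * week * availability * participation
body_kwh_per_work_hour = body_w / work_fraction / 1000
print(f"{body_kwh_per_work_hour:.2f} kWh")  # → 0.99 kWh
```

With those assumptions the result lands at roughly 1 kWh of body energy per work-hour, i.e. a ~50x overhead on the brain's 20 Wh, matching the comment's estimate.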