pembrook 2 hours ago

My guess is that's off by a bit, but sure let's assume that's true.

Now measure the amount of electricity the same prompt will use in 6 years, when algorithmic improvements and 3-4 generations of silicon lower it by 95% (or more).

Will your microwave become 95% more efficient over the next 6 years? No.

Also, how many video prompts will the average person run in a given year? Almost certainly zero. I use AI heavily every day and have probably played with AI video fewer than 4 times, ever.

Yet the average person will certainly use 20,000-100,000 microwave minutes over their lifetime. I use my microwave for 2-3 minutes every day at lunch, for example.
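
A quick back-of-the-envelope check (all numbers assumed, not measured) shows where a range like that comes from:

    # ~2.5 microwave minutes/day over ~60 adult years lands inside the
    # 20,000-100,000 lifetime-minute range above
    minutes_per_day = 2.5
    years = 60
    lifetime_minutes = minutes_per_day * 365 * years    # ~55,000 minutes

    microwave_watts = 1100                              # assumed countertop unit
    lifetime_kwh = lifetime_minutes / 60 * microwave_watts / 1000
    print(f"{lifetime_minutes:,.0f} min ≈ {lifetime_kwh:,.0f} kWh over a lifetime")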

From first principles, the idea that electricity use = bad is wrong. If your electricity comes from burning coal or lignite, then obviously yes, using that electricity has bad externalities.

But a French person running their microwave on a nuclear-powered grid? This is good. Dirty energy sources are the problem, not energy use itself.

adrr 2 hours ago | parent

Are these companies going to toss a $500B+ infrastructure investment away in the next 6 years? What's the average lifespan of an AI compute node?

pembrook an hour ago | parent

Obviously not. AI is nowhere near as ubiquitous as the microwave, so adoption is still scaling.

But as chips and algorithms improve (e.g., a paper just came out about getting the same results with 90% less inference compute using a few algorithmic techniques...on top of the multiple 90% efficiency jumps AI has already had), the energy use per prompt will drop over time.
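
To make that concrete, here's a toy compounding calculation (the gain factors are purely illustrative assumptions, not measurements):

    # one assumed ~10x algorithmic improvement plus ~1.6x energy efficiency
    # per silicon generation, compounded over 3 generations
    energy_per_prompt = 1.0        # normalized to today's cost
    algorithmic_gain = 10          # a "90% less inference" style jump
    per_generation_gain = 1.6      # assumed perf/watt gain per chip generation
    generations = 3

    projected = energy_per_prompt / (algorithmic_gain * per_generation_gain ** generations)
    print(f"~{(1 - projected) * 100:.0f}% lower energy per prompt")   # ~98% with these assumptions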

Meanwhile, energy use per microwave minute will not meaningfully improve over time, so the comparison is silly.

And pretending that the efficiency of AI will never improve, given that it runs on compute, which constantly becomes more efficient, is dumb.