dr_dshiv 4 hours ago
One hour of Claude Code? Well, I'd guess it would be comparable to an hour of driving an electric car. How would we know?
MichaelDickens 4 hours ago | parent
OP says one query uses 0.3 Wh. Driving an electric car for 10 miles uses about 3,000 Wh, which works out to roughly 10,000 Wh per hour. I'm not sure how many queries an hour of Claude Code use amounts to, but assuming one query every 5 seconds, an hour of continuous use is 720 × 0.3 = 216 Wh, or ~50x less than an electric car. OP has a longer article about LLM energy usage: https://hannahritchie.substack.com/p/ai-footprint-august-202...
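The back-of-envelope arithmetic above can be sketched out; note that the per-query energy, query rate, EV efficiency, and implied driving speed are all the commenter's assumptions, not measured values:

```python
# Back-of-envelope comparison: Claude Code use vs. driving an EV for one hour.
# All figures below are assumptions taken from the comment, not measurements.
WH_PER_QUERY = 0.3            # OP's estimate of energy per LLM query
QUERIES_PER_HOUR = 3600 / 5   # assumed: one query every ~5 seconds
EV_WH_PER_MILE = 300          # typical EV efficiency (~3,000 Wh per 10 miles)
EV_MILES_PER_HOUR = 33        # speed implied by the ~10,000 Wh/hour figure

llm_wh_per_hour = WH_PER_QUERY * QUERIES_PER_HOUR    # 216 Wh
ev_wh_per_hour = EV_WH_PER_MILE * EV_MILES_PER_HOUR  # 9,900 Wh (~10,000)
ratio = ev_wh_per_hour / llm_wh_per_hour             # ~46, i.e. roughly 50x

print(f"LLM: {llm_wh_per_hour:.0f} Wh/h, "
      f"EV: {ev_wh_per_hour:.0f} Wh/h, ratio ~{ratio:.0f}x")
```

The ratio is very sensitive to the assumed query rate: at one query per second, the gap shrinks to roughly 10x.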
rsolva 4 hours ago | parent
It is not only about raw power consumption. Comparing driving an electric car with using AI only in kWh hides a major point: hyperscale datacenters are massively centralised, which brings its own problems; a lot of energy is used for cooling, and water consumption is enormous. Charging electric cars at home is distributed and does not suffer from the same problems as the centralised hyperscalers do. Also, running AI models at home is not much different than a gaming session :)