const_cast a day ago
To be completely fair, AI really does use more water than other typical compute tasks, because AI takes A LOT of compute. No, it's not like email or a web server. I can run an email server or Apache on my rinky-dink computer and serve hundreds of requests per second. I can't run ChatGPT; that requires a supercomputer. And with the models I can run, like DeepSeek, I get very few tokens/s. Not requests! Tokens! Yes, inference has an energy cost that is significantly higher than other compute tasks.
Veedrac a day ago | parent
The energy use claims are questionable, but I at least get where they're coming from. The water use is the confusing part. Who looks at a server rack and goes 'darn, look at how water intensive this is'? People use water as a coolant in large part because it's really hard to boil, plus it's typically cheap because it regularly gets delivered to your front door for free.

As for actual numbers, they're not that hard to crunch, and a few good sources have already done so for us:

Simple first-principles estimate: https://epoch.ai/gradient-updates/how-much-energy-does-chatg...

Google report: https://arxiv.org/abs/2508.15734

Altman claim inside a blog post: https://blog.samaltman.com/the-gentle-singularity
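For what it's worth, here's roughly how that kind of crunch goes. This is a sketch, not from any of the linked sources verbatim: the ~0.3 Wh per query is in the ballpark of the first-principles estimates above, and the 1.8 L/kWh water-usage figure is an assumed illustrative data-center WUE, so treat both as placeholders you can swap for your own numbers.

```python
# Back-of-envelope: water use per chatbot query.
# Assumptions (illustrative, not authoritative):
#   energy_per_query_wh: ~0.3 Wh per query (ballpark of published estimates)
#   water_per_kwh_l:     ~1.8 L of water per kWh, an assumed data-center WUE

energy_per_query_wh = 0.3
water_per_kwh_l = 1.8

# Convert Wh -> kWh, multiply by L/kWh, convert L -> mL.
water_per_query_ml = (energy_per_query_wh / 1000) * water_per_kwh_l * 1000

print(f"~{water_per_query_ml:.2f} mL of water per query")  # ~0.54 mL
```

Under those assumptions you land well under a milliliter per query, which is why the headline water numbers only get scary once you multiply by billions of queries.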