danans 2 days ago
> A ChatGPT prompt uses 0.3 Wh, which is approximately how much energy a Google search took in 2009.

That's the number their CEO put out, but AFAIK it is completely unverified (they did not provide any background on how it was calculated). Believing it is an article of faith at this point.

What is concrete and verifiable are the large deals being struck between AI model providers and energy providers, often to be supplied via fossil fuels.
JimDabell 2 days ago
> That's the number their CEO put out, but AFAIK it is completely unverified (they did not provide any background on how it was calculated). Believing it is an article of faith at this point.

Google also puts the median Gemini prompt at 0.24 Wh. The information available from different sources points in the same direction; you don’t have to take Sam Altman’s word for it. 0.3 Wh was already a pretty dependable figure before he said it.

> What is concrete and verifiable are the large deals being struck between AI model providers and energy providers, often to be supplied via fossil fuels.

Which is completely irrelevant to this discussion unless you quantify that in Wh per prompt. Vague “deals are being struck!” hand-wringing doesn’t add to the discussion at all. Why are you demanding absolute, unimpeachable rigour when vendors give specific figures, but are comfortable with hand-waving when it comes to complaining about energy use?
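For context, here is a minimal back-of-envelope sketch of what quantifying an energy deal in Wh per prompt would involve. Every figure below is a hypothetical placeholder, not a number taken from either comment or from any vendor; a real estimate would also need to account for PUE, the split between training and inference, and other workloads sharing the capacity.

```python
# Hypothetical conversion of a headline power deal into Wh per prompt.
# All inputs are made-up placeholders for illustration only.

deal_capacity_mw = 1_000          # hypothetical "1 GW" data-center power deal
average_utilization = 0.7         # hypothetical average draw as a fraction of capacity
prompts_per_day = 2_500_000_000   # hypothetical daily prompt volume served by that capacity

# MW -> W, scale by utilization, multiply by 24 hours to get Wh per day.
energy_per_day_wh = deal_capacity_mw * 1e6 * average_utilization * 24

wh_per_prompt = energy_per_day_wh / prompts_per_day
print(f"{wh_per_prompt:.2f} Wh per prompt")  # ~6.72 Wh with these placeholder inputs
```

The sketch only shows that a headline capacity figure says nothing about per-prompt energy until it is divided by an actual prompt volume; change the placeholder inputs and the result swings by orders of magnitude.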