Tycho 2 days ago
What’s the energy profile of running inference for a typical ChatGPT prompt compared to running the same workload locally?
I’d be curious. AI inference is massively centralised, so of course the data centres will be using a lot of energy, but less centralised use cases may be less power efficient from a holistic perspective.
JimDabell 2 days ago | parent | next
A ChatGPT prompt uses about 0.3 Wh, roughly the energy a Google search took in 2009. AI energy use is negligible compared with other everyday activities. This is a great article on the subject: https://andymasley.substack.com/p/a-cheat-sheet-for-conversa... The same author has published a series of articles that go into a lot of depth on AI energy and water use.
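To put 0.3 Wh in perspective, here’s some back-of-envelope Python. The per-prompt figure is just the one cited above; the kettle number follows from the specific heat of water:

    # How many ChatGPT prompts equal boiling one litre of water?
    PROMPT_WH = 0.3  # Wh per prompt, figure cited above

    # Heating 1 kg of water from 20 C to 100 C:
    # 4186 J/(kg*K) * 1 kg * 80 K ~= 335 kJ
    kettle_wh = 4186 * 1.0 * 80 / 3600  # 1 Wh = 3600 J, so ~93 Wh

    print(f"One kettle boil: {kettle_wh:.0f} Wh "
          f"= {kettle_wh / PROMPT_WH:.0f} prompts")
    # -> roughly 93 Wh, or about 310 prompts per boil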
slfnflctd 2 days ago | parent | prev
These are the kinds of questions we need to pursue to develop better insight into the overall societal impact of current and near-future LLMs. Energy usage is a critical measure of any technology. The tradeoffs between alternative use cases should be modeled as accurately as possible, including all significant externalities.
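As a minimal sketch of what such a model might look like, here’s a hedged centralized-vs-local comparison in Python. Every number except the 0.3 Wh per prompt quoted upthread is a placeholder assumption, not a measurement:

    # Sketch: centralized vs. local inference energy per prompt.
    # All local/overhead parameters below are illustrative assumptions.
    DATACENTER_WH_PER_PROMPT = 0.3  # figure cited upthread
    DATACENTER_PUE = 1.2            # assumed facility overhead

    LOCAL_GPU_WATTS = 300           # assumed consumer GPU draw
    LOCAL_SECONDS_PER_PROMPT = 10   # assumed local generation time
    LOCAL_SYSTEM_OVERHEAD = 1.3     # assumed CPU/RAM/PSU overhead

    def datacenter_wh(prompts: int) -> float:
        """Centralized inference energy, including facility overhead."""
        return prompts * DATACENTER_WH_PER_PROMPT * DATACENTER_PUE

    def local_wh(prompts: int) -> float:
        """Local inference energy on a consumer GPU."""
        per_prompt = LOCAL_GPU_WATTS * LOCAL_SECONDS_PER_PROMPT / 3600
        return prompts * per_prompt * LOCAL_SYSTEM_OVERHEAD

    n = 100
    print(f"{n} prompts, centralized: {datacenter_wh(n):.0f} Wh")  # ~36 Wh
    print(f"{n} prompts, local:       {local_wh(n):.0f} Wh")       # ~108 Wh

Under these made-up numbers the data centre comes out well ahead; the real work is in how accurately each assumed parameter can be pinned down.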