▲ 5 hours ago
Let's not gloss over the electrical supply. These chips won't work for free.
▲ jampekka 5 hours ago | parent [-]
LLM inference uses on the order of 1 Wh per query. That's under 10 meters of driving on an EV or running air conditioning for under 5 seconds. https://hannahritchie.substack.com/p/ai-footprint-august-202...
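A quick back-of-envelope check of those two comparisons, using assumed (not sourced) figures of ~160 Wh/km for a typical EV and ~1 kW for a window air conditioner:

```python
# Sanity-check the "1 Wh per query" comparisons above.
# The EV consumption and AC power draw are assumed ballpark figures.
QUERY_WH = 1.0           # ~1 Wh per LLM inference query

ev_wh_per_km = 160.0     # assumed typical EV consumption
ev_meters = QUERY_WH / ev_wh_per_km * 1000
print(f"EV driving: {ev_meters:.2f} m")          # 6.25 m, under 10 m

ac_watts = 1000.0        # assumed AC power draw
ac_seconds = QUERY_WH * 3600 / ac_watts
print(f"Air conditioning: {ac_seconds:.1f} s")   # 3.6 s, under 5 s
```

Both numbers land comfortably inside the bounds claimed in the comment.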