jampekka 5 hours ago
LLM inference uses on the order of 1 Wh per query. That's under 10 meters of driving on an EV, or running an air conditioner for under 5 seconds. https://hannahritchie.substack.com/p/ai-footprint-august-202...
bluefirebrand 3 hours ago
One query is not going to be a useful benchmark when people are deploying AI swarms in loops to solve simple problems.
deadbabe 4 hours ago
Or a human riding a stationary bike for 36 seconds.