visarga | 3 days ago
I did a little investigation. It turns out that GPT-4's training consumed about as much energy as 300 cars over their lifetimes, which comes to about 50 GWh. Not really that much; the families on a short street could burn that kind of energy. As for inference, an hour of GPT-4 usage consumes less energy than an hour of watching Netflix. And compared to everything else, datacenter energy usage amounts to about 5%. Making great economies on LLM energy won't save the planet.
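A quick back-of-envelope check of the 300-car figure, assuming roughly 150,000 lifetime miles per car at 25 mpg and 33.7 kWh of energy per gallon of gasoline (all of these are assumed values, not figures from the comment):

```python
# Sanity check of "300 cars' lifetime energy ~= 50 GWh".
# Every input below is a rough assumption, not a measured figure.

LIFETIME_MILES = 150_000   # assumed typical lifetime mileage of a car
MPG = 25                   # assumed average fuel economy
KWH_PER_GALLON = 33.7      # energy content of a gallon of gasoline
CARS = 300

gallons_per_car = LIFETIME_MILES / MPG            # ~6,000 gallons
kwh_per_car = gallons_per_car * KWH_PER_GALLON    # ~202,000 kWh (~0.2 GWh)
total_gwh = CARS * kwh_per_car / 1e6              # kWh -> GWh

print(f"{total_gwh:.0f} GWh")  # ~61 GWh, the same order as the claimed 50 GWh
```

Under those assumptions the arithmetic lands in the right ballpark, so the 50 GWh claim is at least internally consistent.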
lelanthran | 3 days ago | parent
> As for inference, GPT-4 usage for an hour consumes less than watching Netflix for an hour.

This can't be correct; I'd like to see how this was measured. Running a GPU at full throttle for one hour uses less power than serving data for one hour? I'm very sceptical.
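One way the parent's claim could hold despite full-throttle GPUs is batching: a serving GPU is shared across many concurrent users, so the per-user draw is a fraction of the card's wattage. A minimal sketch, where the GPU wattage, batch size, and streaming figure are all my assumptions rather than anything measured in this thread:

```python
# Rough per-user framing of inference power vs. streaming power.
# All numbers are illustrative assumptions, not measurements.

GPU_WATTS = 700            # assumed draw of one inference GPU at full load
CONCURRENT_USERS = 50      # assumed requests served in parallel via batching
STREAMING_WH_PER_HOUR = 80 # one published estimate for an hour of streaming

per_user_wh = GPU_WATTS / CONCURRENT_USERS  # Wh per user per hour
print(f"GPU per user: {per_user_wh:.0f} Wh/h vs streaming: {STREAMING_WH_PER_HOUR} Wh/h")
# With these assumptions: ~14 Wh/h per user, below the streaming figure.
```

Whether the comparison actually holds depends entirely on the real batch sizes and which streaming estimate you trust, which is exactly why a source for the measurement matters here.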