jeffbee a day ago

This is a good way to contextualize the energy and carbon intensity of AI training. Every single time you fly a plane like this across a continent or ocean, you use energy comparable to a large model training run.

hedora a day ago | parent [-]

Source? Large model training runs use far more energy than flying a plane across the Atlantic, so this doesn't sound right.

jjk166 a day ago | parent [-]

Yeah, it's way off. Training GPT-4 required energy equivalent to about 1.3 million gallons of jet fuel; a fully fueled A321 carries about 9,000 gallons. That's two orders of magnitude apart. Even a GPT-3 training run would have been about 4 times as energy-intensive as an A321 flight.
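The arithmetic behind that comparison can be checked back-of-envelope. A minimal sketch, assuming figures not stated in the thread: jet fuel carries roughly 142 MJ per US gallon, the widely cited order-of-magnitude estimate for GPT-4 training is ~50 GWh, and GPT-3's is ~1,300 MWh:

```python
# Back-of-envelope check of the energy comparison above.
# All constants are assumptions, not figures from the thread's sources.
MJ_PER_GALLON = 142        # approx. energy density of Jet A per US gallon
GPT4_TRAINING_GWH = 50     # order-of-magnitude estimate for GPT-4 training
GPT3_TRAINING_MWH = 1_287  # commonly cited GPT-3 training estimate
A321_FUEL_GALLONS = 9_000  # roughly a fully fueled A321

# Convert training energy to joules, then to gallons of jet fuel
gpt4_gallons = (GPT4_TRAINING_GWH * 1e9 * 3600) / (MJ_PER_GALLON * 1e6)
gpt3_gallons = (GPT3_TRAINING_MWH * 1e6 * 3600) / (MJ_PER_GALLON * 1e6)

print(f"GPT-4 ~ {gpt4_gallons:,.0f} gallons of jet fuel")
print(f"ratio to one A321 fuel load: {gpt4_gallons / A321_FUEL_GALLONS:.0f}x")
print(f"GPT-3 ~ {gpt3_gallons / A321_FUEL_GALLONS:.1f}x one A321 fuel load")
```

Under those assumptions GPT-4 comes out near 1.3 million gallons, roughly 140 A321 fuel loads, and GPT-3 at a few fuel loads, consistent with the "two orders of magnitude" and "about 4 times" figures above.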

jeffbee a day ago | parent [-]

Flying an A321neo JFK to LHR emits over 60 tons of CO2, which is 50% more than was emitted when training GLaM.
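That CO2 figure can also be sanity-checked. A rough sketch, under my own assumptions (not from the comment): burning one US gallon of jet fuel releases about 9.6 kg of CO2, a JFK-LHR A321neo flight burns on the order of 6,500 gallons, and GLaM's reported training footprint is about 40 tCO2e:

```python
# Rough check of the flight-vs-GLaM CO2 comparison.
# All three constants are assumptions for illustration.
KG_CO2_PER_GALLON = 9.6        # approx. CO2 from burning one US gallon of Jet A
FLIGHT_FUEL_BURN_GALLONS = 6_500  # assumed JFK-LHR burn for an A321neo
GLAM_TONS_CO2E = 40.2          # reported GLaM training footprint (tCO2e)

flight_tons = FLIGHT_FUEL_BURN_GALLONS * KG_CO2_PER_GALLON / 1000
excess = flight_tons / GLAM_TONS_CO2E - 1

print(f"flight: ~{flight_tons:.0f} t CO2, about {excess:.0%} more than GLaM")
```

With those inputs the flight lands in the low 60s of tons, on the order of 50% above GLaM's reported footprint, matching the claim above.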

hedora 10 hours ago | parent [-]

Those CO2 numbers probably aren’t trustworthy.

If the training run hadn't happened, would the renewable/nuclear plants with lower marginal costs have curtailed production before the carbon-intensive plants with higher marginal costs? That doesn't make any economic sense. Instead, carbon-intensive generation ramped up to meet the extra demand the run created.

If the companies that ran the training can also show me a 6000-ton brick of carbon they pulled out of the atmosphere, or the equivalent in early-decommissioned natural gas / coal boilers, then I'll stand corrected.