adrr | 4 hours ago
Depends on the prompt. Do a video prompt and one 30-second video will use as much electricity as running your microwave on high for 15 minutes.
adenta | an hour ago
source? | |||||||||||||||||
pembrook | 2 hours ago
My guess is that's off by a bit, but sure, let's assume it's true. Now measure the amount of electricity the same prompt will use in 6 years, when algorithmic efficiency and 3-4 generations of silicon lower it by 95% (or more). Will your microwave become 95% more efficient over the next 6 years? No.

Also, how many video prompts will the average person run in a given year? Almost certainly 0. I use AI heavily every day and have probably played with AI video fewer than 4 times, ever. Yet the average person will certainly use 20,000-100,000 microwave minutes over their lifetime. I use my microwave for 2-3 minutes every day at lunch, for example.

From first principles, the idea that electricity use = bad is wrong. If your electricity comes from burning coal or lignite, then obviously using that electricity has bad externalities. But a French person running their microwave on a nuclear-powered grid? This is fine. Dirty energy sources are the problem, not energy use itself.
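The arithmetic in the thread can be sanity-checked with a quick sketch. The microwave wattage (1.1 kW) and the 70-year lifetime are illustrative assumptions, not sourced figures:

```python
# Back-of-envelope check of the figures in the thread above.
# MICROWAVE_KW and the 70-year span are assumptions for illustration.

MICROWAVE_KW = 1.1  # assumed typical draw for a microwave on high

# Claimed energy for one 30-second AI video:
# equivalent to a microwave on high for 15 minutes.
video_kwh = MICROWAVE_KW * (15 / 60)

# Hypothetical 95% efficiency gain over 6 years (silicon + algorithms).
video_kwh_future = video_kwh * (1 - 0.95)

# Lifetime microwave use: 2.5 min/day over ~70 years.
lifetime_minutes = 2.5 * 365 * 70
lifetime_kwh = MICROWAVE_KW * lifetime_minutes / 60

print(f"one video today:  {video_kwh:.3f} kWh")
print(f"one video in 6y:  {video_kwh_future:.4f} kWh")
print(f"lifetime microwave minutes: {lifetime_minutes:,.0f}")
print(f"lifetime microwave energy:  {lifetime_kwh:,.0f} kWh")
```

Under these assumptions one video comes to about 0.275 kWh, and the 2.5 min/day habit lands at roughly 64,000 lifetime minutes, which falls inside the 20,000-100,000 range claimed above.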