| ▲ | camillomiller 4 hours ago |
| Is that worth the cost of this technology? Both in terms of financial shenanigans and its environmental cost? |
|
| ▲ | subroutine 4 hours ago | parent | next [-] |
| Are you asking if the 10 seconds it takes AI to generate an image is more costly to the environment than a commissioned graphics artist using a laptop for 5-6 hours, or a painter who uses physical media sourced from all over the world? |
| |
| ▲ | bayindirh 3 hours ago | parent | next [-]
| In short, yes. A modern laptop runs almost fanless, like a 486 from the days of yore. A single H200 pumps out 700W continuously in a data center, and you run thousands of them. Also, don't forget the training and fine-tuning runs required for the models. Mass transportation / global logistics can be very efficient and cheap. Before the pandemic, it was in some cases cheaper to import fresh tomatoes from half a world away than to grow them locally. A single container of painting supplies is nothing in the grand scheme of things, especially compared with what data centers are consuming and emitting.
| ▲ | ToValueFunfetti 3 hours ago | parent | next [-]
| This is a plainly dishonest comparison. A single H200 does not need to run continuously for you to generate a dozen pictures. And then you immediately pivot to comparing the paint usage against "the grand scheme of things": 700W is nothing in the grand scheme of things.
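The wattage argument in this subthread can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, where every figure (GPU draw, generation time, laptop draw, commission length) is an illustrative assumption drawn loosely from the comments, not a measurement:

```python
# Rough energy-per-image comparison. All numbers are illustrative
# assumptions, not measurements.
H200_POWER_W = 700              # assumed GPU board power under load
GEN_TIME_S = 10                 # assumed time to generate one image
LAPTOP_POWER_W = 50             # assumed designer-laptop average draw
COMMISSION_TIME_S = 5.5 * 3600  # midpoint of the "5-6 hours" estimate

gpu_wh = H200_POWER_W * GEN_TIME_S / 3600             # Wh for one inference
laptop_wh = LAPTOP_POWER_W * COMMISSION_TIME_S / 3600  # Wh for one commission

print(f"GPU inference:  {gpu_wh:.2f} Wh")
print(f"laptop session: {laptop_wh:.0f} Wh")
```

Under these assumptions a single inference is small next to a laptop workday, which is the grandparent's point; the counterpoint raised in the replies is that per-image inference ignores training runs and the sheer volume of images generated.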
| ▲ | cpill 3 hours ago | parent | prev [-]
| These are unfair comparisons. It's not just a single laptop running all day, it's all the graphic designer laptops that get replaced. It's not a single container of painting supplies, it's all of them (which are toxic, by the way). So if power were plentiful and environmentally clean, you'd be on board with it?
| |
| ▲ | dilDDoS 3 hours ago | parent | prev [-]
| Cheaper/faster tech increases overall consumption though. Without the friction of commissioning a graphics artist to design something, a user can generate thousands of images (and iterate on those images multiple times to achieve what they want), resulting in way more images overall. I'm not really well versed on the environmental cost, more just (neutrally) pointing out that comparing a single 10s image to a 5-6 hour commission ignores the fact that the majority of these images probably would never have existed in the first place without AI.
| ▲ | runarberg 3 hours ago | parent [-]
| Also, ignoring training when talking about the environmental costs is bad faith. Without training this image would not exist, and if nobody were generating images like these, the training would not happen. So we should really count the 10 seconds it took for inference plus the weeks or months of high-intensity compute it took to train the model.
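The amortization argument above can be sketched numerically. Every number below (cluster size, training duration, lifetime image volume) is a hypothetical assumption for illustration only:

```python
# Amortize an assumed training run over an assumed lifetime image count.
# Every number here is a made-up assumption for illustration.
TRAIN_GPUS = 1_000       # hypothetical training cluster size
TRAIN_DAYS = 30          # hypothetical training duration
GPU_POWER_W = 700        # assumed per-GPU draw

train_kwh = TRAIN_GPUS * GPU_POWER_W * TRAIN_DAYS * 24 / 1000  # total kWh

IMAGES_SERVED = 1_000_000_000        # hypothetical lifetime inference volume
infer_kwh = 700 * 10 / 3600 / 1000   # per-image inference (700W for 10s)

amortized_kwh = train_kwh / IMAGES_SERVED  # training's share per image
print(f"training share per image: {amortized_kwh * 1000:.3f} Wh")
print(f"inference per image:      {infer_kwh * 1000:.2f} Wh")
```

The smaller IMAGES_SERVED is, the more training dominates the per-image cost, which is why quoting inference alone flatters the total.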
|
|
|
| ▲ | Legend2440 4 hours ago | parent | prev | next [-] |
| The environmental cost is significantly overblown, especially water usage. |
| |
| ▲ | bayindirh 4 hours ago | parent [-]
| I work with direct liquid cooled systems. If the datacenter runs open DLC systems (most AI datacenters in the US in fact do), a lot of water is being wasted, 24/7/365. A mid-tier TOP500 system (think #250-#325) draws about 0.75MW. AI data centers consume orders of magnitude more. To cool that behemoth you need to pump tons of water per minute in the inner loop. The outer loop might be slower, but it's a lot of heated water at the end of the day. To prevent water wastage you can go closed loop (for both inner and outer loops), but you can't escape the heat you generate and pump into the atmosphere. So, the environmental cost is overblown the way Chernobyl or fallout from a nuclear bomb is overblown. Which is to say, it's not.
| ▲ | Legend2440 4 hours ago | parent [-]
| It's not that it doesn't use water; it's that water is not scarce unless you live in a desert. As a country, we use 322 billion gallons of water per day. A few million gallons for a datacenter is nothing.
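The scale claim here comes down to one division. The national figure is the one quoted in the comment; the per-datacenter figure is an assumed stand-in for "a few million gallons":

```python
# Scale check on the water numbers. The datacenter figure is an
# assumption standing in for "a few million gallons" per day.
US_DAILY_GALLONS = 322_000_000_000  # national daily usage, as quoted
DC_DAILY_GALLONS = 5_000_000        # assumed single-datacenter daily usage

share = DC_DAILY_GALLONS / US_DAILY_GALLONS
print(f"one datacenter ~= {share:.6%} of daily national water use")
```

A national aggregate says nothing about local scarcity or the condition the water is returned in, which is what the replies push back on.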
| ▲ | bayindirh 4 hours ago | parent | next [-]
| The problem is you don't just use that water and give it back. The water gets contaminated and heated, making it unsuitable for organisms to live in, or to be processed and used again. In short, when you pump that water back into the river, you're poisoning and cooking the river at the same time, destroying the ecosystem too. Talk about multi-threaded destruction.
| ▲ | Legend2440 3 hours ago | parent [-]
| No, you're making that up. Datacenters do not poison rivers.
| ▲ | bayindirh 3 hours ago | parent [-]
| To reiterate, I work in a closed-loop DLC datacenter. Pipes rust; you can't stop that. That rust seeps into the water. That's inevitable. Moreover, if moss or other growth starts to take over your pipes, you may need to inject chemicals into your outer loop to clean them. Inner loops already use biocides and other chemicals to keep them clean. Look at how nuclear power plants fight organism contamination in the outer cooling loops where they circulate lake/river water. Same thing.
|
| |
| ▲ | jll29 3 hours ago | parent | prev [-]
| Just because some countries waste a lot of water at the present time does not mean it's available as a resource indefinitely.
|
|
|
|
| ▲ | vrc 4 hours ago | parent | prev | next [-] |
| Depends on whether you believe it will ever become cheaper: either the hardware, more efficient smaller models, or energy itself. The techno-optimist believes that is the inevitable and investable future. But on what horizon, and will it get "Zip-drived" before then?
|
| ▲ | 3dsnano 4 hours ago | parent | prev [-] |
| absolutely without a doubt it is |
| |
| ▲ | bayindirh 4 hours ago | parent [-]
| If that energy is used for research, maybe. If used to answer customer questions or generate Studio Ghibli knock-offs, it's not worth it, even a bit.
| ▲ | 3dsnano 3 hours ago | parent [-] | | what’s the difference between those two? how can you say one has more value than the other? |
|
|