| ▲ | palata a day ago |
| > But we have been here before. Predictions of this kind have been made ever since the emergence of the Internet. I don't think I live in the same world as the author. Ever since the emergence of the Internet, "stuff related to IT" has been using more and more energy. It's like saying "5G won't use as much electricity as we are told! In fact 5G is more efficient than 4G". Yep, except that 5G enables us to use a lot more of it, and therefore we use more electricity. It's called the rebound effect. |
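A minimal sketch of the rebound-effect arithmetic described above (all numbers are hypothetical, chosen only to illustrate how a per-unit efficiency gain can still raise total consumption):

    # Hypothetical illustration of the rebound effect: 5G assumed 4x more
    # energy-efficient per GB than 4G, but assumed to carry 6x the traffic.
    energy_per_gb_4g = 1.0    # arbitrary energy units per GB (assumed)
    energy_per_gb_5g = 0.25   # assumed 4x efficiency gain
    traffic_4g = 100          # GB per month (assumed)
    traffic_5g = 600          # GB per month once heavy use is cheap (assumed)

    total_4g = energy_per_gb_4g * traffic_4g   # 100.0 units
    total_5g = energy_per_gb_5g * traffic_5g   # 150.0 units
    print(total_4g, total_5g)   # more efficient per GB, yet more energy in total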
|
| ▲ | onlyrealcuzzo a day ago | parent | next [-] |
| If you're using more of it because it's replacing corporate travel, going into the office, and driving across town to see your friends and family (FaceTiming them instead), then you are still MASSIVELY reducing your total energy use. It's not like the majority of electricity use by computers is complete waste. You can pooh-pooh it and say you don't want to live in the digital world, and would rather spend more time flying around the world to work with people in person, or actually see your mom, or buy physical paper in stores it was shipped to, write physical words on it, and have the USPS physically ship it, but that's just wildly, almost unfathomably, less efficient. If Google didn't exist, who knows how many more books I'd need to own, how much time I'd spend buying those books, how much energy I'd spend going to the stores to pick them up, or having them shipped. Using Google almost certainly takes a lot less energy than all of that. While we all like to think that Facebook is a complete waste of time, what would you be spending your time doing otherwise? Probably something that requires more energy than the close-to-nothing it takes to look at memes on your phone. Not to mention, presumably, at least some people are getting some value from even the most wasteful pits of the Internet. Not everything is Bitcoin. |
| |
| ▲ | palata a day ago | parent | next [-] | | You also seem to live in a different world. I urge you to start getting informed about what it takes to build hardware (hint: it does not grow on trees). > then you are still MASSIVELY reducing your total energy use. Instead of using all those caps, look at the numbers: we have them. We use more and more energy. > but that's just wildly, almost unfathomably, less efficient. Not sure if you really need the hint, but you shouldn't spend more time flying around the world. > Using Google almost certainly takes a lot less energy than all of that. It is a fact that it isn't. Before Google, people were using less energy than we are now, period. > Probably something that requires more energy than the close-to-nothing it takes to look at memes on your phone. The industry that gets your memes onto the hardware you call a phone is anything but "close to nothing" when it comes to energy. I would say that you are arguing in bad faith, but with all those examples you're giving, it seems like you are just uninformed. So let me be blunt: your kids will most likely die because of how much energy we use (from one of the plethora of problems that come with it). At this point, we cannot do much about it, but the very least would be to be aware of it. | |
| ▲ | onlyrealcuzzo a day ago | parent [-] | | More energy because there are more people and more things are electrified & mechanized. A massive portion of the world was basically living in the stone age and has been lifted into middle-class lives over the last 60 years. The population has also more than doubled. This is like comparing apples to apes. Sure, if you go back to when we were all monkeys, we are obviously using more energy per capita. If you go back to WW2, The West is using far less energy per capita, even when you account for imports. And again, that's far less energy to produce far better lives. And both of those trends are continuing every year. Sorry, you can't say, globally we use more energy, so every usage of energy is causing us to use more energy. It's not that simple. | |
| ▲ | palata a day ago | parent [-] | | > Sure, if you go back to when we were all monkeys, we are obviously using more energy per capita. We keep using more and more energy per capita, period. You can go back 10 000 years, 200 years or 100 years, it's the same. > If you go back to WW2, The West is using far less energy per capita, even when you account for imports. This is blatantly wrong. > Sorry, you can't say, globally we use more energy, so every usage of energy is causing us to use more energy. It's not that simple. It is that simple: what you wrote is called a tautology: we use more energy, so we use more energy. And every new usage of energy is causing us to use more energy. If you use more, you use more. How is that not simple? :-) | | |
| ▲ | onlyrealcuzzo a day ago | parent [-] | | https://datacommons.org/tools/visualization#visType%3Dtimeli... | | |
| ▲ | palata a day ago | parent [-] | | Ok, you show me a line. Where does it explain what it measures? It says "Energy used per capita", it doesn't say what energy it accounts for. It most definitely does NOT account for the commute of the employees in China who worked on parts of your smartphone. Does it account for the use of TikTok? To what extent? Does it account for the AC in the datacenters used by TikTok outside the US? |
|
|
|
| |
| ▲ | wahnfrieden a day ago | parent | prev [-] | | How do you account for overall energy use being up massively, and rising at record-breaking pace? | |
| ▲ | timschmidt a day ago | parent | next [-] | | According to the following references, most residential energy is used for heating and cooling. Most commercial energy is used for lighting, heating, and cooling. And most industrial energy is used in chemical production, petroleum and coal products, and paper production. 1: https://www1.eere.energy.gov/buildings/publications/pdfs/cor... 2: https://www.eia.gov/energyexplained/use-of-energy/industry.p... | |
| ▲ | rtuulik a day ago | parent | prev | next [-] | | It's not. For the US, energy use per capita has been trending downwards since 1979. For the developing world, the increase in energy usage is tied to increasing living standards. | |
| ▲ | palata a day ago | parent [-] | | > For the US, energy use per capita has been trending downwards since 1979 It would be relevant if the US were completely isolated from the rest of the world. But guess what? The hardware you used to write this comment does not come from the US. Not taking into account the energy that went into building your hardware and transporting it to where you are currently using it is... well, wrong. |
| |
| ▲ | onlyrealcuzzo a day ago | parent | prev [-] | | > How do you account for overall energy use being up massively, and rising at record-breaking pace? That has nothing to do with how much energy is spent on Google and the Internet vs. how many more people there are, and how much more stuff the average person in developing economies has. |
|
|
|
| ▲ | Arnt a day ago | parent | prev | next [-] |
| Nothing forces the rebound effect to dominate. Computers grow cheaper, we rebound by buying ones with higher capacity, but the overall price still shrinks. I bet the computer you used to post today cost much less than Colossus. Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead. You can stream films via 5G that you might not have done via 4G, but you might've streamed via WLAN or perhaps a cat5 cable instead. The rebound effect doesn't force 5G to use more power than WLAN/GBE. Or more power than driving to a cinema, if you want to compare really widely. The film you stream makes it comparable, no? |
| |
| ▲ | everdrive a day ago | parent | next [-] | | >Nothing forces the rebound effect to dominate. Human nature does. We're like a gas, and we expand to fill the space we're in. If technology uses less power, in general we'll just use more of it until we hit whatever natural limits are present (usually cost or availability). I'm not sure I'm a proponent of usage taxes, but they definitely have the right idea; people will just keep doing more things until it becomes too expensive or they are otherwise restricted. The problem you run into is how the public reacts when "they" are trying to force a bunch of limitations on you that you didn't previously need to live with. It's politically impossible, even in a case where it's the right choice. | |
| ▲ | Arnt a day ago | parent [-] | | I don't understand why "we're like a gas, and expand to fill the space we're in". What makes the simile apply to e.g. AI or 5G when it doesn't apply to others, e.g. computer prices? | | |
| ▲ | everdrive a day ago | parent [-] | | I think the Apple ARM chips are a good example. They're fantastically more efficient and fantastically powerful. We _could_ take this incredible platform and say "we can probably do personal computing on 3-5 watts if we really focus on efficiency." But we don't do that. With more powerful chips, websites and apps and operating systems will get less efficient, bigger, more bloated. If there's any slack in the system we'll just take it up with crap. The chips will be faster next year so why bother making things more efficient? Repeat this process forever, and we eat up all of our efficiency gains. | | |
| ▲ | Arnt a day ago | parent | next [-] | | My Apple ARM laptop has 24+ hours battery lifetime in practice, three times as much as the best laptop I had in the 2000-2015 period, and it's lighter than most of those laptops too. (Can't remember the battery lifetime of my 2015-2019 laptop.) Clearly not all efficiency gains have been eaten up. | | |
| ▲ | everdrive a day ago | parent [-] | | Agreed, it's not a perfectly straight line, but if we had 2000-2015 computing requirements _and_ Apple ARM efficiency, we could be somewhere very special. |
| |
| ▲ | vel0city a day ago | parent | prev [-] | | A lot of households did make the change from having a few hundred watt PC with a hundred watt monitor to a couple of phones and maybe a tablet that don't use anywhere near as much energy. They use those devices all day, but overall they use less power than a few hours a week of an older desktop. |
|
|
| |
| ▲ | bilekas a day ago | parent | prev | next [-] | | > Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead Am I missing something, or has the need for vast GPU horsepower been solved? Those requirements were not in DCs before, and they're only going up. Whatever way you look at it, there's got to be an increase in power consumption somewhere, no? | |
| ▲ | Arnt a day ago | parent | next [-] | | Not necessarily, no. You can pick and choose your comparisons, and make an increase appear or not. Take weather forecasts as an example. Weather forecasting uses massively powerful computers today. If you compare that forecasting with the lack of forecasts two hundred years ago, there obviously is an increase in power usage (no electricity was used then) or there obviously isn't (today's result is something we didn't have then, so it would be an apples-to-nothing comparison). If you say "the GPUs are using power now that they weren't using before", you're implicitly doing the former kind of comparison. Which is obviously correct or obviously wrong ;) | |
| ▲ | timschmidt a day ago | parent | prev [-] | | GPU compute in datacenters has been a thing for at least 20 years. Many of the top500 have included significant GPU clusters for that long. There's nothing computationally special about AI compared to other workloads, and in fact it seems to lend itself to multiplexing quite efficiently - it's possible to process thousands of prompts for a negligible memory bandwidth increase over a single prompt. AI is still very near the beginning of the optimization process. We're still using (relatively) general purpose processors to run it. Dedicated accelerators are beginning to appear. Many software optimizations will be found. FPGAs and ASICs will be designed and fabbed. Process nodes will continue to shrink. Moore will continue to exponentially decrease costs over time as with all other workloads. | |
| ▲ | philipwhiuk a day ago | parent | next [-] | | > Moore will continue to exponentially decrease costs over time as with all other workloads. There's absolutely no guarantee of this. The continuation of Moore's law is far from certain (NVIDIA think it's dead already). | | |
| ▲ | timschmidt a day ago | parent [-] | | > NVIDIA think it's dead already Perhaps that's what Jensen says publicly, but Nvidia's next-generation chip contains more transistors than the last. And the one after that will too. Let me know when they align their $Trillions behind smaller, less complex designs; then I'll believe that they think Moore's law is out of juice. Until then, they can sit with the group of people who've been vocally wrong about Moore's law's end for the last 50 years. Our chips are still overwhelmingly 2D in design, just a few dozen layers thick but billions of transistors wide. We have quite a ways to go based on a first-principles analysis alone. And indeed, that's what chip engineers like Jim Keller say: https://www.youtube.com/watch?v=c01BlUDIlK4 So ask yourself how it benefits Jensen to convince you otherwise. | |
| ▲ | adgjlsfhk1 a day ago | parent [-] | | Progress continues, but at a far slower rate than it used to. Nvidia has gained ~6x density in the past 9 years (1080 to 5090), while a doubling every 2 years would be >20x density in 9 years. The past 6 years (since the 3090) are even worse, with only a ~3x density gain. | |
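A quick back-of-the-envelope check of those figures (a sketch only; the density gains and timespans are taken from the comment above, not independently verified):

    import math

    def implied_doubling_time(gain, years):
        # Years per doubling implied by a given density gain over a given period.
        return years / math.log2(gain)

    print(round(2 ** (9 / 2), 1))                  # ~22.6x if density doubled every 2 years for 9 years
    print(round(implied_doubling_time(6, 9), 1))   # ~3.5 years per doubling for the claimed 6x over 9 years
    print(round(implied_doubling_time(3, 6), 1))   # ~3.8 years per doubling for the claimed 3x over 6 years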
| ▲ | timschmidt a day ago | parent [-] | | Moore's law says nothing about density. "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year." Density is one way in which industry has met this observation over decades. New processes (NMOS, CMOS, etc) is another. New packaging techniques (flip chip, BGA, etc). New substrates. There's no limit to process innovation. Nvidia's also optimizing their designs for things other than minimum component cost. I.e. higher clock speeds, lower temperatures, lower power consumption, etc. It may seem like I'm picking a nit here, but such compromises are fundamental to the cost efficiency Moore was referencing. All data I've seen, once fully considered, indicates that Moore's law is healthy and thriving. |
|
|
| |
| ▲ | bilekas a day ago | parent | prev [-] | | > GPU compute in datacenters has been a thing for at least 20 years. Many of the top500 have included significant GPU clusters for that long. Of course they've been a thing, but for specialised situations (maybe rendering farms or backroom mining centers). It's disingenuous to claim that there's not an exponential growth in GPU usage. | |
| ▲ | timschmidt a day ago | parent [-] | | Of course they've been a thing, but for specialized situations (maybe calculating trajectories or breaking codes). It's disingenuous to claim that there's not an exponential growth in digital computer usage. Jest aside, the use of digital computation has exploded exponentially, for sure. But alongside that explosion, fueled by it and fueling it reciprocally, the cost (in energy and dollars) of each computation has plummeted exponentially. | |
| ▲ | bilekas a day ago | parent [-] | | I really would like to see more of your data showing that; I think it would put this discussion to rest, actually, because I keep seeing articles that dispute it. At least older ones that ring bells, specifically https://epoch.ai/blog/trends-in-the-dollar-training-cost-of-... | |
| ▲ | timschmidt a day ago | parent [-] | | You can find plenty of jumping off points for research here: https://en.wikipedia.org/wiki/Performance_per_watt Along with this lovely graph captioned: "Exponential growth of supercomputer performance per watt based on data from the Green500 list." (note the log scale): https://en.wikipedia.org/wiki/Performance_per_watt#/media/Fi... From the section about GPU performance per watt, I'll quote: "With modern GPUs, energy usage is an important constraint on the maximum computational capabilities that can be achieved. GPU designs are usually highly scalable, allowing the manufacturer to put multiple chips on the same video card, or to use multiple video cards that work in parallel. Peak performance of any system is essentially limited by the amount of power it can draw and the amount of heat it can dissipate. Consequently, performance per watt of a GPU design translates directly into peak performance of a system that uses that design." |
|
|
|
|
| |
| ▲ | palata a day ago | parent | prev | next [-] | | > Nothing forces the rebound effect to dominate. Not sure what to say to that. Yeah, it would be great if we didn't put so many resources into destroying our own world. I agree. The fact is that the rebound effect very much dominates everything we do. I'm not saying it should, I'm saying it does. It's an observation. | |
| ▲ | Arnt a day ago | parent [-] | | I think you're telling me that graphs like these always point upwards, right? https://ourworldindata.org/grapher/per-capita-energy-use?tab... My own impression is that sometimes the aggregate total grows and sometimes it doesn't. And when it grows, sometimes that's because the rebound effect dominates. | | |
| ▲ | palata a day ago | parent [-] | | Read the part in your link that says "Here, energy refers to primary energy". Primary energy in the US ignores the primary energy used in China for goods that end up being imported into the US. |
|
| |
| ▲ | Analemma_ a day ago | parent | prev [-] | | There is some limit to the rebound effect because people only have so many hours in the day, but we’re nowhere near the ceiling of how much AI compute people could use. Note how many people pay for the $200/month plans from Anthropic, OAI etc. and still hit limits because they constantly spend $8000 worth of tokens letting the agents burn and churn. It’s pretty obvious that as compute gets cheaper via hardware improvements and power buildout, usage is going to climb exponentially as people go “eh, let the agent just run on autopilot, who cares if it takes 2MM tokens to do [simple task]”. I think for the foreseeable future we should consider the rebound effect in this sector to be in full force and not expect any decreases in power usage for a long time. |
|
|
| ▲ | taeric a day ago | parent | prev | next [-] |
| Do we use more electricity because of 5G? I confess I'd assume modern phones and repeater networks use less power than older ones, even in aggregate. I can easily agree that phones that have internet capabilities use more, as a whole, than those that didn't. The infrastructure needs were very different. But, especially if you are comparing to 4G technology, much of that infrastructure already had to distribute the content that was driving the extra use. I would think this would be like cars. If you had taken estimates of how much pollution vehicles produced 40 years ago and assumed that would stay constant even as the number of cars went up, you'd probably assume we are living in the worst air imaginable. Instead, even gas cars got far better as time went on. Doesn't mean the problem went away, of course. And some sources of the pollution, like tires, did get worse as a share of the total as we scaled up. Hopefully we can find ways to make that better, as well. |
| |
| ▲ | palata a day ago | parent | next [-] | | > Do we use more electricity because of 5G? I confess I'd assume modern phones and repeater networks use less power than older ones, even in aggregate. If we did exactly the same things with 5G as we did with 4G, it would be more efficient. But why do we develop 5G? Because we can do more. It is more efficient, but we do much more, so we increase our energy consumption. This is called the "rebound effect". It's observed for every. single. technology. | |
| ▲ | aceazzameen a day ago | parent | prev | next [-] | | As a data point, I turn 5G off on my phone and get several hours more battery life using 4G. I'm pretty sure the higher bandwidth is consuming more energy, especially since 5G works at shorter distances and probably needs the power to stay connected to cell towers. | | | |
| ▲ | ElevenLathe a day ago | parent | prev [-] | | The phones, towers, and networks are only the tip of the power iceberg. How much electricity are we burning to run the servers to service the requests that all these 5G phones can now make because of all the wonderfully cheap wireless connectivity? |
|
|
| ▲ | Majestic121 a day ago | parent | prev | next [-] |
| This is countered in the article. "Yet throughout this period, the actual share of electricity use accounted for by the IT sector has hovered between 1 and 2 per cent, accounting for less than 1 per cent of global greenhouse gas emissions." |
| |
| ▲ | prewett a day ago | parent | next [-] | | But presumably total energy use has been going up, so while the relative percentage might be the same, I doubt very much that the absolute quantity of GWh/year has stayed the same. |
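A small sketch of the point about shares versus absolute quantities (the totals below are assumed round numbers for illustration, not measured figures):

    # If IT's share of electricity stays around 1.5% while total generation grows,
    # IT's absolute consumption still grows in proportion to the total.
    share_it = 0.015
    total_twh_then = 15000   # assumed global electricity use, TWh/year
    total_twh_now = 25000    # assumed global electricity use, TWh/year
    print(share_it * total_twh_then, share_it * total_twh_now)   # 225.0 375.0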
| ▲ | palata a day ago | parent | prev [-] | | It's bad faith to talk about greenhouse gas emissions without taking into account the indirect contributions. When you use a computer, you cannot only account for the electricity that goes into the computer that is sitting on your desk. You have to account for all the energy that went into extracting the materials from the ground, building the electronics, shipping it across the world, and then the electricity to operate it. |
|
|
| ▲ | bicepjai a day ago | parent | prev | next [-] |
| Sounds similar to Jevons Paradox |
|