timschmidt a day ago

Of course they've been a thing, but only in specialized situations, maybe calculating trajectories or breaking codes. It's disingenuous to claim that there hasn't been exponential growth in digital computer usage.

Jest aside, the use of digital computation has exploded exponentially, for sure. But alongside that explosion, fueled by it and fueling it reciprocally, the cost (in energy and dollars) of each computation has plummeted exponentially.
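To make the shape of that claim concrete, here's a minimal sketch (Python, with purely illustrative growth and decline rates, not measured data) of how total computation can explode while cost per computation falls, so total spend grows far more slowly than usage:

    # Illustrative only: hypothetical annual rates, not measured data.
    GROWTH_PER_YEAR = 1.5        # assume computations performed grow 50% per year
    COST_DECLINE_PER_YEAR = 0.7  # assume cost per computation falls 30% per year

    ops = 1.0          # normalized computations in year 0
    cost_per_op = 1.0  # normalized cost per computation in year 0

    for year in range(0, 21, 5):
        total_spend = ops * cost_per_op
        print(f"year {year:2d}: ops x{ops:10.1f}, cost/op x{cost_per_op:.4f}, spend x{total_spend:.2f}")
        # advance five years at a time
        ops *= GROWTH_PER_YEAR ** 5
        cost_per_op *= COST_DECLINE_PER_YEAR ** 5

With those made-up rates, usage grows roughly 3000x over 20 years while total spend grows only about 2.6x, which is the reciprocal-fueling dynamic in miniature.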

bilekas a day ago | parent [-]

I really would like to see more of your data showing that; I think it would actually put this discussion to rest, because I keep seeing articles that dispute it. At least older ones that ring bells, specifically https://epoch.ai/blog/trends-in-the-dollar-training-cost-of-...

timschmidt a day ago | parent [-]

You can find plenty of jumping off points for research here: https://en.wikipedia.org/wiki/Performance_per_watt

Along with this lovely graph captioned: "Exponential growth of supercomputer performance per watt based on data from the Green500 list." (note the log scale): https://en.wikipedia.org/wiki/Performance_per_watt#/media/Fi...

From the section about GPU performance per watt, I'll quote:

"With modern GPUs, energy usage is an important constraint on the maximum computational capabilities that can be achieved. GPU designs are usually highly scalable, allowing the manufacturer to put multiple chips on the same video card, or to use multiple video cards that work in parallel. Peak performance of any system is essentially limited by the amount of power it can draw and the amount of heat it can dissipate. Consequently, performance per watt of a GPU design translates directly into peak performance of a system that uses that design."