| ▲ | JumpCrisscross 4 days ago |
| > Remember when they told us in CS class that it's better to design more efficient algorithms than to buy a faster CPU? No? The tradeoff is entirely one between the value of labour and the value of hardware. If dev hours are cheap and CPUs expensive, you optimise. If it's the other way around, which it is in AI, you buy more CPUs and GPUs. |
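The labour-versus-hardware tradeoff above can be sketched as a back-of-the-envelope break-even calculation. All figures below are illustrative assumptions, not sourced numbers:

```python
def breakeven_dev_hours(hardware_cost, dev_hourly_rate):
    """Hours of optimisation work you can fund before it would have
    been cheaper to simply buy the extra hardware instead."""
    return hardware_cost / dev_hourly_rate

# Illustrative assumptions: one extra GPU server at $30,000,
# one developer at $150/hour.
hours = breakeven_dev_hours(30_000, 150)
print(hours)  # 200.0 -- past ~200 dev hours, buying the box wins
```

If the hardware side of the ratio is an eight-figure GPU cluster, as in AI, the break-even point moves far beyond any plausible optimisation budget, which is the commenter's point.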
|
| ▲ | estimator7292 4 days ago | parent | next [-] |
This makes sense if and only if you entirely ignore all the secondary and tertiary effects of your choices: massively increased energy costs, strain on the grid, depriving local residents of resources for your datacenter, and let's not forget e-waste, pollution from higher energy use, pollution from manufacturing more and more chips, and the pollution and cost of shipping those chips across the planet. Yeah, it's so cheap as to be nearly free.
| ▲ | yannyu 3 days ago | parent | next [-] |
> Things like massively increased energy cost, strain on the grid

This is a peculiarly US-localized problem. For a number of reasons, datacenters are going up all over the world now, and proportionally more of them are outside the US than has historically been the case. Many of these places have easier access to cheaper, cleaner power, with modernized grids capable of handling it.

> pollution from higher energy use

Somewhat coincidentally, energy costs in China and the EU are projected to go down significantly over the next 10 years due to solar and renewables, whereas it's not at all clear that will happen in the US.

As for the rest of the arguments around chip manufacturing and shipping and everything else, well, what do you expect? That we would just stop making chips? We only stopped using horses for transportation when we invented cars. I don't yet see what's going to replace our need for computing.
| ▲ | zekrioca 3 days ago | parent [-] |
Almost everything you wrote is incorrect, which is why you don't provide sources for anything. And at the end, the cherry on top: "yes, the world is ending, so what can we do? I guess nothing, let's just keep burning it so it dies faster."
| ▲ | JumpCrisscross 3 days ago | parent | prev | next [-] |
> it's so cheap as to be nearly free

Both chips and developer time are expensive. Massively so, both in direct cost and in secondary and tertiary effects. (If you think hiring more developers to optimise code has no knock-on effects, I have a bridge to sell you.)

There isn't an iron law that developer time is less valuable than chips. When chip progress stagnates, we tend towards optimising. When the developer pipeline is constrained, e.g. when a new frontier opens, we tend to favour exploration over optimisation. If a CS programme teaches someone to always optimise the algorithm rather than consider whether hardware might be the limitation, it's not a very good one.

In this case, when it comes to AI, there is massive investment going into finding more efficient training and inference algorithms. Research which, ironically enough, generally requires access to energy.
| ▲ | codingrightnow 3 days ago | parent | prev [-] |
[flagged]
|
|
| ▲ | utyop22 3 days ago | parent | prev | next [-] |
> which it is in AI, you buy more CPUs and GPUs

Ermmm, what?
|
| ▲ | rhdhfjej 4 days ago | parent | prev [-] |
| [flagged] |
| |
| ▲ | tootie 3 days ago | parent | next [-] |
It's the difference between computer science and software engineering.
| ▲ | logicchains 4 days ago | parent | prev [-] |
And you would have been mocked by your peers for being so conceited that you'd dare to look down on other people for not inventing an algorithm that doesn't exist, and for which there's no evidence it's even possible for one to exist.
|