loudmax | 2 days ago:
If diffusion models are an improvement over autoregressive models, then the answer is no, due to Jevons paradox. That is, as these models get cheaper and better, they provide more utility, driving more demand. Even as your datacenters become more productive, the demand for their compute grows at an even faster pace. The thing that will limit demand for compute is the world deciding it has sufficient capacity for the kind of "intelligence" these models provide. I don't think anyone has any idea what that world will even look like.
jgalt212 | 2 days ago:
Perhaps, but it's not clear that Jevons paradox is at play here. AI/LLM uptake has been muted, or at least lacking legs, outside of coding. And it's not because of cost (AI inference is already being provided below cost by big tech).