billisonline 2 days ago
"Tons of power generation?" Perhaps we will go in that direction (as OpenAI projects), but it assumes the juice will be worth the squeeze, i.e., that scaling up power consumption for LLM training and/or inference will deliver a qualitatively better product before the scaling laws run out. The disappointing reception of GPT-4.5, while not a definitive end to scaling, was a pretty discouraging sign.