Herring 11 hours ago
Nope, Epoch.ai thinks we have enough to scale till 2030 at least. https://epoch.ai/blog/can-ai-scaling-continue-through-2030
mindwok 8 hours ago
That article is more about feasibility than desirability. There's even a section where they say:

> Settling the question of whether companies or governments will be ready to invest upwards of tens of billions of dollars in large scale training runs is ultimately outside the scope of this article.

Ilya is saying it's unlikely to be desirable, not that it isn't feasible.
techblueberry 8 hours ago
Wait, "nope" just because someone disagrees?
imiric 10 hours ago
That article is from August 2024. A lot has changed since then. Specifically, the performance of SOTA models has been plateauing on all popular benchmarks, and this has been especially evident in 2025. This is why every major model announcement shows comparisons relative to other models, but not a historical graph of performance over time.

Regardless, benchmarks are far from a reliable measurement of the capabilities of these tools, and they will continue to be reinvented and gamed, but the asymptote is showing even on the companies' own benchmarks.

We can certainly continue to throw more compute at the problem. But the point is that scaling the current generation of tech will yield diminishing returns. To make up for this, "AI" companies are now focusing on engineering. 2025 has been the year of MCP, "agents", "skills", etc., and that will continue in 2026. This is a good thing, as these tools need better engineering around them so they can deliver actual value.

But the hype train is running out of steam, and unless there is a significant breakthrough soon, I suspect that next year will be a turning point in this hype cycle.