getnormality 2 hours ago
You're presupposing an answer to what is actually the most interesting question in AI right now: does scaling continue at a sufficiently favorable rate, and if so, how? The AI companies and their frontier models have already ingested the whole internet and reoriented economic growth around data center construction. Meanwhile, Google throttles my own Gemini Pro usage with increasingly tight constraints. The big firms are feeling the pain on the compute side. Substantial improvements must now come from algorithmic efficiency, which is bottlenecked mostly by human ingenuity. AI-assisted coding will help somewhat, but only with the drudgery, not the hardest parts. If we ask a frontier AI researcher how they do algorithmic innovation, I am quite sure the answer will not be "the AI does it for me."
asdff 2 hours ago | parent
Of course it continues. Look at the investment in hardware going on. Even with no algorithmic efficiency improvements, that hardware will simply brute-force more performance out of the system, the way a massive, inefficient V8 with paltry horsepower-per-liter figures still puts out plenty of power through sheer displacement.