airstrike 9 hours ago

> One could say that the introduction of the personal computer became a "race to the bottom." But it was only the start of the dot-com bubble era, a bubble that brought about a lot of beneficial market expansion.

I think the comparison is only half valid, since personal computers were really just a continuation of the broader innovation of general-purpose computing.

I don't think LLMs have nearly as much mileage left to offer, so to keep growing, "AI" will need at least a couple of step changes in architecture and compute.

zozbot234 9 hours ago | parent [-]

I don't think anyone knows for sure how much mileage/scalability LLMs have. Given what we do know, I suspect if you can afford to spend more compute on even longer training runs, you can still get much better results compared to SOTA, even for "simple" domains like text/language.

airstrike 8 hours ago | parent [-]

I think we're pretty much out of "spend more compute on even longer training runs" at this point.