cess11 3 hours ago

I'm not sure what "exponential improvement" would mean in this context, but large models have been a massively hyped and heavily invested-in thing for what, three or four years now, right?

And what do they run on? Information, the production of which is being throttled by the technology itself: in part because the salespeople claim it can (and should) "replace" workers and thinkers, and in part because many people have really low standards for entertainment and accept so-called slop instead of cheap tropes manually stitched together.

So it seems unlikely that they'll get fed the information they would need to outpace the public internet, widely pirated books, and so on.

fragmede 2 hours ago | parent

The counterpoint to this is that information is sealed up in bottles that previously haven't been worth unsealing. How much can you charge for the source code of a program written in Zig that calculates the Fibonacci sequence? Approximately zero. But generating a million tested programs, with source code, each one run through a compiler and a test suite, suddenly becomes worth it for the AI labs to buy up, if not generate for themselves, as "information". So imo there's still a way to go, even if the human internet isn't growing as much post-AI as it did in all the years before.
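
The filtering step is cheap to automate. A rough sketch (assuming the zig toolchain is on PATH, and with the Fibonacci program standing in for whatever a generator actually emits):

    import pathlib
    import subprocess
    import tempfile

    # Rough sketch of a "generate, compile, test, keep" filter.
    # Assumes the `zig` toolchain is on PATH; SAMPLE_PROGRAM stands in
    # for whatever a code generator would actually produce.

    SAMPLE_PROGRAM = """
    const std = @import("std");

    fn fib(n: u32) u64 {
        if (n < 2) return n;
        var a: u64 = 0;
        var b: u64 = 1;
        var i: u32 = 2;
        while (i <= n) : (i += 1) {
            const next = a + b;
            a = b;
            b = next;
        }
        return b;
    }

    test "fib" {
        try std.testing.expectEqual(@as(u64, 55), fib(10));
    }
    """

    def passes_compile_and_tests(source: str) -> bool:
        """Keep a candidate only if `zig test` compiles it and its tests pass."""
        with tempfile.TemporaryDirectory() as tmp:
            path = pathlib.Path(tmp) / "candidate.zig"
            path.write_text(source)
            result = subprocess.run(
                ["zig", "test", str(path)],
                capture_output=True,
                text=True,
            )
            return result.returncode == 0

    if __name__ == "__main__":
        keep = passes_compile_and_tests(SAMPLE_PROGRAM)
        print("kept" if keep else "discarded")

Run that loop over a million generated candidates and you have a corpus of compiler-checked, test-passing code that never existed on the public internet.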