mpalmer a day ago

    It crossed some threshold that was both real and magical
Only compared to our experience at the time.

    and future improvements are relying on that basic set of features at their core
Language models are inherently limited, and it's possible - likely, IMO - that the next set of qualitative leaps in machine intelligence will come from a different set of ideas entirely.
zer00eyz a day ago | parent

Learning != Training.

That's not a period, it's a full stop. There is no debate to be had here.

IF an LLM makes some sort of breakthrough (and massive data collation allows for that to happen), it needs to be retrained to absorb its own new invention.
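To make the learning-vs-training distinction concrete, here's a minimal PyTorch sketch (a toy model with made-up shapes, not how any real LLM is built or trained): at inference time the model can condition on new context, but nothing is written back into its weights; only an explicit gradient step, i.e. (re)training, changes them.

    import torch
    import torch.nn as nn

    # Toy stand-in for a language model; sizes are arbitrary, purely illustrative.
    model = nn.Sequential(nn.Embedding(100, 16), nn.Flatten(), nn.Linear(16 * 4, 100))
    tokens = torch.randint(0, 100, (1, 4))

    # "Learning" at inference time: conditions on context, but no parameters change.
    # Whatever the model "figured out" here is gone on the next call.
    with torch.no_grad():
        logits = model(tokens)

    # Training: only a gradient step actually writes anything back into the weights.
    target = torch.randint(0, 100, (1,))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss = nn.functional.cross_entropy(model(tokens), target)
    loss.backward()
    optimizer.step()  # the weights move only here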

But we also have a large problem in our industry: hardware used to evolve to make software more efficient. Not only is that no longer happening, but we're making our software more complex and, to some degree, less efficient with every generation.

This is particularly problematic in the LLM space: every generation of "ML" on the LLM side seems to be getting less efficient with compute. (Note: this isn't quite the case in all areas of ML; YOLO models running on embedded compute are kind of amazing.)
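For contrast, this is roughly what using a small YOLO-family detector looks like today, a sketch assuming the ultralytics package and a pretrained nano checkpoint; the image path is a placeholder. A detector of a few million parameters doing useful work per frame is the kind of efficiency the LLM side isn't trending toward.

    # Assumes: `pip install ultralytics`; "street.jpg" is a placeholder image path.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")     # nano checkpoint, small enough for edge hardware
    results = model("street.jpg")  # single forward pass, no training involved

    for box in results[0].boxes:
        print(box.cls, box.conf, box.xyxy)  # class id, confidence, bounding box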

Compactness, efficiency, and reproducibility are the directions the industry needs to evolve in if it ever hopes to be sustainable.