sharemywin 2 days ago

People assume training on past data means no novelty, but novelty comes from recombination. No one has written your exact function, with your exact inputs, constraints, and edge cases, yet an LLM can generate a working version from a prompt. That’s new output. The real limitation isn’t novelty, it’s grounding.

gtowey a day ago

Synthesis is different from genesis.

Just because there are lots of tasks that can be accomplished without anything novel doesn't mean LLMs can match a human. It just means humans spend a lot of time doing really boring stuff.

deadbabe 2 days ago

That’s only part of it. The real intelligence lies in knowing what to recombine, and that is where AI falls short.