somenameforme 3 days ago

Their output is in natural language, and that's about where the similarities with humans end. They're token prediction algorithms, nothing more and nothing less. This can achieve some absolutely remarkable output, probably because our languages (both formal and natural) are absurdly redundant. But the next token being a word, instead of e.g. a ticker price, doesn't suddenly make them more like humans than computers.
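For concreteness, here is a minimal sketch of what "next token prediction" means mechanically, assuming the Hugging Face transformers library and GPT-2 as a stand-in model; the greedy argmax loop is an illustrative simplification rather than how any particular production system samples.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load a small causal language model purely for illustration.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tok("Humans invented language", return_tensors="pt").input_ids
for _ in range(20):
    logits = model(ids).logits            # scores over the vocabulary at each position
    next_id = logits[0, -1].argmax()      # greedy: take the single most likely next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)  # append it and repeat

print(tok.decode(ids[0]))
```

The loop never does anything except choose a token that is already representable in its vocabulary and likely given its training data, which is the property the rest of this thread argues about.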

nisegami 3 days ago

I see this "next token predictor" description being used as a justification for drawing a distinction between LLMs and human intelligence. While I agree with that description of LLMs, I think the concept of a "next token predictor" is much, much closer to describing human intelligence than most people realize.

somenameforme 3 days ago

Humans invented language, from nothing. For that matter, we went from a collective knowledge not far beyond 'stab them with the pokey end' to putting a man on the Moon. And we did it in the blink of an eye, if you consider how inefficient we are at retaining and conferring knowledge over time. Have an LLM start from the same basis humanity did and it will never produce anything, because the next token to get from [nothing] to [man on the Moon] simply does not exist for an LLM until we add it to its training data.