ACCount37 2 hours ago

Why would language restrict LLMs?

"Language" is an input/output interface. It doesn't define the internals that produce those inputs and outputs. And between those inputs and outputs sits a massive computational process that doesn't operate on symbols or words internally.
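A toy sketch (not any real model's code; the vocabulary, layer, and weights are made up for illustration) of that point: discrete words exist only at the boundary, while everything in between is continuous vector arithmetic.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"the": 0, "cat": 1, "sat": 2}       # discrete symbols at the boundary
embed = rng.normal(size=(len(vocab), 4))     # internal state: dense float vectors
W = rng.normal(size=(4, len(vocab)))         # one linear "layer" standing in for the stack

tokens = ["the", "cat"]
ids = [vocab[t] for t in tokens]             # words become indices...
hidden = embed[ids].mean(axis=0)             # ...then pure numbers; no words in here
logits = hidden @ W                          # scores over the vocabulary
inv = {i: t for t, i in vocab.items()}
output = inv[int(np.argmax(logits))]         # a symbol reappears only at the output
print(output)
```

The "language" part is just the lookup at the start and the argmax at the end; the computation itself never touches a symbol.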

And, what "clearer milestones" do you want exactly?

To me, LLMs crushing NLU and CSR (natural language understanding and common-sense reasoning) was the milestone. It was the "oh fuck" moment, the clear signal that old bets are off and AGI timelines are now compressed.

sublinear 18 minutes ago | parent | next

Language is an interface between whatever our thoughts actually are and the outside world.

Imagine trying to write apps without thinking about the limitations of the APIs you use. In fact, we only recently escaped that same stupidity in the SaaS era! That's how silly LLMs will seem in the near future. They will stick around as the smarter chatbots we've wanted for so long, but they are very far from AGI.

AlexandrB 2 hours ago | parent | prev

Language massively restricts LLMs because there's no way to create novel concepts while limited to existing language.

Humans create new words and grammatical constructs all the time in the process of building/discovering new things. This is true even in math, where new operators are created to express new operations. Are LLMs even capable of this kind of novelty?

There's also the problem that parts of human experience are inexpressible in language. A very basic example is navigating 3D space. This is not something that had to be explained to you as a baby; your brain just learned how to do it. But the problem goes deeper. Consider intuition about the motion of objects in space. Even before Newton described gravitation, every 3-year-old knew that a dropped object would fall to the ground in a certain way. Formalizing this basic intuition in language took thousands of years of human development and spurred the creation of calculus. An AI does not have these fundamental intuitions, nor any way to obtain them. Its conception of the world is only as good as the models and language (both mathematical and spoken) we have to express it.

ACCount37 39 minutes ago | parent

> Its conception of the world is only as good as the models and language (both mathematical and spoken) we have to express it.

Which is pretty damn good, all things considered.

And sure, training set text doesn't contain everything - but modern AIs aren't limited to just the training set text. Even at the training stage, things like multimodal inputs and RLVR have joined the fray.

I don't think "create novel concepts" is a real limitation at all. Nothing prevents an AI from inventing new notations. GPT-4o would often do that when talking to AI psychosis victims.