tim333 10 hours ago

I've heard of it, but I'm not sure I buy it. I mean, you can get examples of most grammatical constructs in a language from a few pages of text or a few hours of speech. It takes a long time to go from "mama" to, say, "I feel if I were in Chomsky's position I might have examined LLMs more," during which kids are exposed to a lot of language.

tgv 7 hours ago

Small neural networks are absolutely horrible at producing syntactically valid output. BTW, English is a very simple language to get right. Even a Markov model with some depth can produce fairly good-looking English. But other languages, even from the same family, already have features that require much deeper syntactic "knowledge." So the baseline isn't "looks like an English sentence," since children can and do learn other, more complicated languages with the same ease.
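(As a rough illustration of the kind of shallow Markov model mentioned above, here is a minimal order-2 word-level sketch in Python; the corpus path is a placeholder, not something from this thread, and a real demonstration would use a sizable English text.)

    import random
    from collections import defaultdict

    def train(words, order=2):
        # Map each (word_i, word_i+1) pair to the words observed after it.
        model = defaultdict(list)
        for i in range(len(words) - order):
            key = tuple(words[i:i + order])
            model[key].append(words[i + order])
        return model

    def generate(model, length=30, order=2):
        # Start from a random state and sample the next word from what
        # followed that state in the training text.
        state = random.choice(list(model.keys()))
        out = list(state)
        for _ in range(length):
            choices = model.get(tuple(out[-order:]))
            if not choices:
                break
            out.append(random.choice(choices))
        return " ".join(out)

    # Placeholder corpus path; any plain-text English source will do.
    corpus = open("some_english_text.txt").read().split()
    print(generate(train(corpus)))

With enough text, output like this often reads as plausible English locally, even though the model has no syntactic structure beyond the two-word window.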

Show me a tabula rasa neural network that can learn those structures from the input a child gets, and you could be right. However, if you have to impose architectural constraints on the network, you've lost the argument.

tim333 3 hours ago

Humans are quicker at picking up patterns than LLMs, though. For example, show a person who has never seen a Dalmatian a single picture of one, then ask them to spot Dalmatians in other images, and they can do it straight away, whereas LLMs need many examples. Which doesn't mean we have innate knowledge of Dalmatians, just that we pick up patterns quickly.