tim333 10 hours ago
I've heard of it but I'm not sure I buy it. I mean, you can get examples of most grammatical constructs in a language in a few pages of text or a few hours of speech. It takes a long time to go from "mama" to, say, "I feel if I were in Chomsky's position I might have examined LLMs more", during which kids are exposed to a lot of language.
tgv 7 hours ago
Small neural networks are absolutely horrible at producing syntactically valid output. BTW, English is a very simple language to get right: even a Markov model with some depth can produce fairly good-looking English. But other languages, even from the same family, have features that require much deeper syntactic "knowledge." So the baseline isn't "looks like an English sentence," since children learn other, more complicated languages with the same ease. Show me a tabula rasa neural network that can learn those structures from the input a child gets, and you could be right. However, if you have to impose architectural constraints on the network, you've already lost.
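For illustration, the kind of Markov model mentioned above is just a lookup from the last few words to plausible next words, with no syntactic structure at all. A minimal sketch (the corpus file name in the usage comment is hypothetical):

```python
import random
from collections import defaultdict

def build_markov_model(text, order=2):
    """Map each tuple of `order` consecutive words to the words observed after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        model[state].append(words[i + order])
    return model

def generate(model, length=30):
    """Random-walk the chain: pick a start state, then repeatedly sample a successor."""
    state = random.choice(list(model.keys()))
    output = list(state)
    for _ in range(length):
        successors = model.get(state)
        if not successors:
            break
        output.append(random.choice(successors))
        state = tuple(output[-len(state):])
    return " ".join(output)

# Usage (hypothetical corpus file):
# with open("corpus.txt") as f:
#     model = build_markov_model(f.read(), order=2)
# print(generate(model))
```

Output from such a model is locally fluent but has no grasp of long-range dependencies (agreement, case, embedding), which is the point being made about deeper syntactic structure.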