| ▲ | tgv 13 hours ago |
| I never agreed with his views on syntax, but the (his?) idea that large parts of our language capabilities are innate is almost beyond doubt. Are people still arguing against it? |
|
| ▲ | tim333 12 hours ago | parent | next [-] |
| I think it's about the details. Chomsky argued that a lot of grammar must be innate, but the ability of LLMs to handle grammar quite well with only a basic artificial neural network argues against that. |
| ▲ | tgv 11 hours ago | parent [-]
| Are you familiar with the 'poverty of the stimulus' argument? The amount of language we get to process, all of it aural, is the tiniest fraction of the data an LLM gets to train on, and it is processed in much less time, too. So no, LLMs do not argue against that. |
| ▲ | tim333 10 hours ago | parent [-]
| I've heard of it, but I'm not sure I buy it. You can find examples of most grammatical constructs in a language within a few pages of text or a few hours of speech. And it takes a long time to go from "mama" to, say, "I feel if I were in Chomsky's position I might have examined LLMs more", during which kids are exposed to a lot of language. |
| ▲ | tgv 7 hours ago | parent [-]
| Small neural networks are absolutely horrible at producing syntactically valid output. BTW, English is a very simple language to get right: even a Markov model with some depth can produce fairly good-looking English (see the sketch below). But other languages, even ones from the same family, have features that require much deeper syntactic "knowledge". So the baseline isn't "looks like an English sentence", since children can and do learn other, more complicated languages with the same ease. Show me a tabula rasa neural network that can learn those structures from the input a child gets, and you could be right. However, if you have to impose architectural constraints on the network, you'll have lost. |
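A minimal sketch of the kind of word-level Markov model mentioned in the comment above, assuming order 2 and written in Python; the file name corpus.txt and the names train/generate are illustrative, not any particular library's API:

    import random
    from collections import defaultdict

    def train(words, order=2):
        # Map each n-gram (tuple of `order` consecutive words)
        # to the list of words observed immediately after it.
        model = defaultdict(list)
        for i in range(len(words) - order):
            model[tuple(words[i:i + order])].append(words[i + order])
        return model

    def generate(model, length=30):
        # Start from a random observed n-gram and walk the chain,
        # always sampling the next word from what followed this state.
        state = random.choice(list(model.keys()))
        out = list(state)
        for _ in range(length):
            followers = model.get(state)
            if not followers:
                break
            out.append(random.choice(followers))
            state = tuple(out[-len(state):])
        return " ".join(out)

    # "corpus.txt" is a stand-in for any plain-text English corpus.
    words = open("corpus.txt").read().split()
    print(generate(train(words, order=2)))

With order 2 or 3 on a decent amount of text, the output tends to be locally fluent English that drifts in meaning over longer spans, which is the "looks like an English sentence" baseline being discussed.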
| ▲ | tim333 3 hours ago | parent [-]
| Humans are quicker at picking up patterns than LLMs, though. For example, show someone who has never seen one a single picture of a dalmatian, then ask them to spot dalmatians in other images, and they can do it straight away, whereas LLMs need many examples. Which doesn't mean we have innate knowledge of dalmatians, just that we pick up patterns quickly. |
|
| ▲ | throaway123213 12 hours ago | parent | prev | next [-] |
| Universal grammar is probably partially correct, but Chomsky's position is too sweeping. Grammar just doesn't demand the kind of complexity and precision that he implies. |
|
| ▲ | numpad0 11 hours ago | parent | prev [-] |
| IMO the problem is that his theories are elaborate logical justifications to sugarcoat some cringe supremacist beliefs about languages and politics. The sugar has always been useful, but the core is pure poison. |
| ▲ | tgv 11 hours ago | parent [-]
| Chomsky doesn't have any supremacist ideas about language, AFAIK. And I doubt his political views can be classified as such either. What poison do you speak of? |
|