| ▲ | suddenlybananas 16 hours ago |
| And yet, when Chomsky says it, everyone gets very upset for some reason. |
|
| ▲ | tgv 14 hours ago | parent | next [-] |
| I never agreed with his views on syntax, but the (his?) idea that large parts of our language capabilities are innate is almost beyond doubt. Are people still arguing against it? |
| ▲ | tim333 13 hours ago | parent | next [-] | | I think it's about the details. Chomsky argued that a lot of grammar must be innate, but the ability of LLMs to handle grammar quite well with only a generic artificial neural network argues against that. | | |
| ▲ | tgv 11 hours ago | parent [-] | | Are you familiar with the 'poverty of the stimulus' argument? The amount of language we get to process, all aural, is a tiny fraction of the amount of data an LLM gets to train on. And it's processed in much less time, too. So no, LLMs do not argue against that. | | |
| ▲ | tim333 11 hours ago | parent [-] | | I've heard of it, but I'm not sure I buy it. I mean, you can find examples of most grammatical constructs in a language in a few pages of text or a few hours of speech. It takes a long time to go from "mama" to, say, "I feel if I were in Chomsky's position I might have examined LLMs more", during which kids are exposed to a lot of language. | | |
| ▲ | tgv 8 hours ago | parent [-] | | Small neural networks are absolutely horrible at producing syntactically valid output. BTW, English is a very simple language to get right. Even a Markov model with some depth can achieve fairly good looking English. But other languages, even from the same family, already have features which require much deeper syntactic "knowledge." So the baseline isn't "looks like an English sentence," since children can and do learn other, more complicated languages with the same ease. Show me a tabula rasa neural network that can learn those structures from the input a child gets, and you could be right. However, if you have to impose architectural constraints on the network, you'll have lost. | | |
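As an aside on the Markov-model remark above: a word-level Markov chain really is only a few lines of code. This is a toy sketch, not anything from the thread; the corpus, the order-2 context, and all function names are invented for illustration. It produces locally plausible word sequences with no syntactic knowledge at all:

```python
import random
from collections import defaultdict

def build_markov(words, order=2):
    """Map each `order`-word context to the words that follow it in the corpus."""
    table = defaultdict(list)
    for i in range(len(words) - order):
        context = tuple(words[i:i + order])
        table[context].append(words[i + order])
    return table

def generate(table, length=10, seed=0):
    """Walk the chain: each step samples a successor of the current context."""
    rng = random.Random(seed)
    context = rng.choice(list(table))   # start from a random seen context
    out = list(context)
    for _ in range(length):
        followers = table.get(tuple(out[-len(context):]))
        if not followers:               # dead end: context never continued in corpus
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Tiny invented corpus, just to show the mechanics.
corpus = ("the cat sat on the mat and the dog sat on the rug "
          "and the cat saw the dog on the mat").split()
table = build_markov(corpus, order=2)
print(generate(table, length=10))
```

Every generated bigram occurred somewhere in the training text, which is why short stretches look fluent; there is no grammar anywhere, which is tgv's point about why "looks like English" is a weak baseline.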
| ▲ | tim333 4 hours ago | parent [-] | | Humans are quicker at picking up patterns than LLMs, though. For example, if you show a human who has never seen one a single picture of a dalmatian and then ask them to spot dalmatians in other images, they can do it straight away, whereas LLMs need many examples. Which doesn't mean we have innate knowledge of dalmatians, just that we pick up patterns quickly. |
|
| ▲ | throaway123213 13 hours ago | parent | prev | next [-] | | Universal grammar is probably partially correct, but Chomsky's position is too sweeping. Grammar just doesn't demand the kind of complexity and precision that he implies. | |
| ▲ | numpad0 12 hours ago | parent | prev [-] | | IMO the problem is that his theories are elaborate logical justifications to sugarcoat some cringe supremacy beliefs about languages and politics. The sugar has always been useful but the core is pure poison. | | |
| ▲ | tgv 11 hours ago | parent [-] | | Chomsky doesn't have any supremacist ideas about language, AFAIK. And I doubt his political views can be classified as such either. What poison do you speak of? |
|
| ▲ | lapcat 10 hours ago | parent | prev | next [-] |
| I wouldn't say that "very upset" is a correct or fair characterization of disputes in linguistics. Chomsky's universal grammar work was based on too few languages and too little data, and it doesn't hold up when you look at the full range of human languages and usage. See also Jenny Saffran's empirical work on infant statistical language learning. The broad idea that some things are innate doesn't vindicate Chomsky's specific theories. |
| ▲ | suddenlybananas 8 hours ago | parent [-] | | Saying that Chomskyan linguistics 'only works on a few languages' is a claim so ridiculous that it's only made by people who haven't engaged with generative linguistics since the 1960s. There's an enormous body of work on typologically diverse languages such as Japanese, Salish languages, Greenlandic, Basque, Gungbe, or Kwa languages. I can provide references if you'd like. | | |
|
| ▲ | skeezyjefferson 14 hours ago | parent | prev | next [-] |
| [dead] |
|
| ▲ | SubiculumCode 16 hours ago | parent | prev [-] |
| He should have stopped there in his career. |