LorenPechtel | 8 hours ago
He's got it backwards.

"Language is a serial, low-bandwidth channel. It transmits one proposition at a time, sequentially. Each proposition can relate a small number of variables: “if X and Y, then Z.” Complex conditionals can extend this to perhaps five or six variables before the sentence becomes unparseable: “if X and Y but not Z, unless W and V, then Q.” ... This is not a claim about human cognitive limitations. It is an information-theoretic claim about the channel capacity of natural language relative to the complexity of the models being transmitted."

No. This definitely is about human cognitive limitations. Consider *why* that six-variable rule is unparseable: human working memory typically holds about 7 items. His "unparseable" example contains, by a generous accounting, 10 entities: 6 variables and 4 relationships. (By a strict accounting you also need to count the "then", for 11 entities.) Very, very few humans could hold all of that at once.

Knowledge must be chopped into chunks small enough to fit into our working memory before it can be understood and incorporated into our models. Once one chunk has been incorporated into our model we can add another chunk to it, repeating until we have built up a complex model. And it's not just read, store, read another: each piece must be worked with to actually be modeled.

(This process does not need to be strictly linear, though. To take his stupid pedestrian: they can separately learn speed, distance, stopping distance, etc., and then tie them together. But if his pedestrian cares whether the driver is inattentive, his model is wrong anyway: why are you doing something that depends on a moving driver being aware of you in the first place?)
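To make the point concrete: the six-variable sentence is trivial for a machine to parse and evaluate, which is exactly why the limit looks cognitive rather than channel-capacity. A minimal sketch, assuming one particular reading of the ambiguous "unless" clause (that W-and-V overrides the rest):

```python
def q(x, y, z, w, v):
    # "if X and Y but not Z, unless W and V, then Q"
    # One assumed reading: Q holds when (X and Y and not Z),
    # except when the override (W and V) applies.
    return (x and y and not z) and not (w and v)

# The machine handles all six variables effortlessly;
# a human reader has to chunk the sentence to follow it.
print(q(True, True, False, False, False))  # base condition holds -> True
print(q(True, True, False, True, True))    # override applies -> False
```

Note that the "unless" clause admits at least one other reading (unless-as-alternative-trigger); the ambiguity itself is part of why the sentence strains a human reader, while either reading is a one-line boolean for a machine.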