▲ | mallowdram 2 days ago |
There are vast, unexplored sums of knowledge that LLMs can't automate, mimic, or regurgitate. This is obvious simply from meme culture. Try asking ChatGPT to derive meaning from the engravings on Kirk's shooter's casings and listen to the explanations unravel. Once you attach the nearly limitless loads of meaning available to event perception (use cognitive mapping in neuroscience, where behavior has no meaning, only tasks, and task demands vary so wildly that semantic loads are factors rather than simple numbers), LLMs appear to be puppets of folk psychology, using tokens predictably in embedding space. These tokens have nothing to do with the reality of knowledge or events. Of course engineers can't grasp this: you've been severely limited to folk-psychology-infected cog sci as the base your code is developed from, when in reality it's almost totally illusory. CS has no future game in probability; it's now a bureaucracy. The millions or billions of parameters have zero access to problems like these that sit beyond cog sci. I'll let Kelso zing it: https://drive.google.com/file/d/1oK0E4siLUv9MFCYuOoG0Jir_65T... |
▲ | nradov a day ago | parent |
No one, human or LLM, actually knows the meanings of the phrases that Tyler James Robinson wrote on his cartridge casings. There's lots of speculation, but he isn't talking, and even if he were, we wouldn't know whether he was telling the truth. If you want us to take you seriously then you'll have to come up with a valid example instead of posting a bunch of pseudo-intellectual drivel. |