▲ nradov a day ago
I don't think you actually grasp what you're writing either, or at least you can't explain it in any coherent way, so LLMs are no worse on that score.
▲ mallowdram a day ago | parent [-]
There are vast, unexplored bodies of knowledge that LLMs can't automate, mimic, or regurgitate. This is obvious from meme culture alone. Try asking ChatGPT to derive meaning from the engravings on Kirk's shooter's casings, and listen to the explanations unravel.

Once you attach the nearly limitless loads of meaning available to event perception (consider cognitive mapping in neuroscience, where behavior has no meaning, only tasks, and task demands vary so wildly that semantic loads are factors rather than simple numbers), LLMs look like puppets of folk psychology, predicting tokens in embedding space. Those tokens have nothing to do with the reality of knowledge or events. Of course engineers can't grasp this: you've been severely limited to folk-psychology-infected cog sci as the base your code is developed from, when in reality it's almost totally illusory. CS has no future game in probability; it's now a bureaucracy. Millions or billions of parameters have zero access to problems like these, which sit beyond cog sci. I'll let Kelso zing it: https://drive.google.com/file/d/1oK0E4siLUv9MFCYuOoG0Jir_65T...
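(Aside, for concreteness: "predicting tokens in embedding space" refers to the standard next-token mechanism. Below is a minimal toy sketch of that loop; the vocabulary is made up and the random weights stand in for a trained transformer, so it illustrates only the mechanism, not any real model.)

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy vocabulary; real models use tens of thousands of tokens.
    vocab = ["the", "shooter", "casing", "engraving", "meaning"]
    d_model = 8  # embedding dimension, arbitrary for this sketch

    # Each token id maps to a dense vector ("embedding space").
    embeddings = rng.normal(size=(len(vocab), d_model))
    # Projection from a hidden state back to a score for every vocab entry.
    output_proj = rng.normal(size=(d_model, len(vocab)))

    def next_token(context_ids):
        # A real LLM runs many transformer layers here; this sketch just
        # averages the context embeddings to keep the mechanism visible.
        h = embeddings[context_ids].mean(axis=0)
        logits = h @ output_proj
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()  # softmax over the vocabulary
        return vocab[int(np.argmax(probs))], probs

    tok, probs = next_token([0, 1])  # context: "the shooter"
    print(tok, probs.round(3))

The point the sketch makes visible: at every step the model only ranks vocabulary entries by geometry in a learned vector space, which is the mechanism being called a "puppet" above; whether that counts as access to meaning is exactly what this thread is arguing about.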