runarberg 3 hours ago
This is the first time I've heard the term "LLM cognition," and I am horrified. LLMs don't have cognition. LLMs are statistical inference machines that predict an output given some input. There are no mental processes, no sensory information, and certainly no knowledge involved; only statistical reasoning, inference, interpolation, and prediction. Comparing the human mind to an LLM is like comparing a rubber tire to a calf muscle, or a hydraulic system to the gravitational force. They belong in different categories and cannot be responsibly compared.

When I see these tests, I presume they are made to demonstrate the limitations of this technology. It is both relevant and important that consumers know they are not dealing with magic and are not being sold a lie (in a healthy economy a consumer protection agency would ideally do that for us; but here we are).
Benjammer 3 hours ago | parent
> They belong in different categories

Categories of _what_, exactly? What word would you use to describe this "kind" of which LLMs and humans are two very different "categories"? I simply chose the word "cognition". I think you're getting hung up on semantics here a bit more than is reasonable.
| ||||||||||||||||||||||||||
CamperBob2 3 hours ago | parent
You'll need to explain the IMO (International Mathematical Olympiad) results, then.
| ||||||||||||||||||||||||||