Retric 3 days ago
Thinking is different from forming long-term memories. An LLM could be "thinking" in one of two ways: either between adding each individual token, or collectively across multiple tokens. At the individual-token level the physical mechanism doesn't seem to fit the definition, being essentially reflexive action, but across multiple tokens it's a little more questionable, especially as multiple approaches are used.
t23414321 3 days ago | parent
An LLM's output is calculated from language — from things said by humans before, whether true or not. It's not the anthropomorphic process that the word "thinking" would suggest (a word chosen because it sells well). > across multiple tokens — but how many? How many happen in a single person's life? How many in a single calculation? Does it matter, if the calculation doesn't reflect thinking but stays a calculation all the same? (Would a conversation with a radio make any sense?)