▲ t23414321 3 days ago
An LLM's output is calculated from language (from things said by humans before, whether true or not). It's not some anthropomorphic process, as the word "thinking" would suggest (chosen to sell well). > across multiple tokens - but how many? How many of them happen in a single person's life? How many in some calculation? Does it matter, if the calculation doesn't reflect that but stays the same all along? (A conversation with a radio - would it make any sense?)
▲ Retric 3 days ago | parent [-]
The general public has no issue saying a computer is thinking when you're sitting there waiting for it to calculate a route or run a similar process like selecting a chess move. The connotation is simply an internal process of indeterminate length rather than a reflexive one. So they don't apply it when a GPU is slinging out 120 FPS in a first-person shooter.