chpatrick 4 days ago
Actually, LLMs crunch through half a terabyte of weights before they "speak". How are you so confident that nothing in that immense amount of processing has anything to do with thinking? Modern LLMs are also trained to have an inner dialogue before they output an answer to the user. When you type, you also produce the next word that fits some requirement. That doesn't mean you're not thinking.
leptons 4 days ago | parent
"crunch through half a terabyte of weights" isn't thinking. Following grammatical rules to produce a readable sentence isn't thought, it's statistics, and whether that sentence is factual or foolish isn't something the LLM cares about. If LLMs didn't so constantly produce garbage, I might agree with you more. | ||||||||