chpatrick 4 days ago

Actually, LLMs crunch through half a terabyte of weights before they "speak". How are you so confident that nothing in that immense amount of processing has anything to do with thinking? Modern LLMs are also trained to have an inner dialogue before they output an answer to the user.

When you type your next word, you're also producing a word that fits some requirement. That doesn't mean you're not thinking.

leptons 4 days ago | parent

"crunch through half a terabyte of weights" isn't thinking. Following grammatical rules to produce a readable sentence isn't thought, it's statistics, and whether that sentence is factual or foolish isn't something the LLM cares about. If LLMs didn't so constantly produce garbage, I might agree with you more.

chpatrick 4 days ago | parent

They don't follow "grammatical rules"; they process inputs through an incredibly large neural net. It's like saying humans aren't really thinking because their brains are made of meat.
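
To be concrete, "processing inputs with a neural net" means repeated matrix multiplies and nonlinearities that end in a probability distribution over next tokens; no explicit grammar rulebook appears anywhere. Here's a toy sketch in Python, with made-up sizes and random weights standing in for a real model's billions of parameters:

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((8, 8))   # made-up weights standing in for billions
    x = rng.standard_normal(8)        # embedding of the input so far

    h = np.maximum(0, W @ x)                       # one layer: multiply, then nonlinearity
    logits = rng.standard_normal((5, 8)) @ h       # project onto a tiny fake vocabulary
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax: distribution over next tokens
    print(probs)                                   # the model samples from this to "speak"

Whether that kind of computation counts as "thinking" is exactly what we're disagreeing about; my point is just that nothing in it looks like following grammatical rules.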