grey-area 3 hours ago
No, it’s not like saying that, because that is not at all what humans do when they think. This is self-evident when comparing human responses to problems with those of LLMs, and you have been taken in by the marketing of ‘agents’ etc.
stavros 3 hours ago | parent
You've misunderstood what I'm saying. Regardless of whether LLMs think or not, the sentence "LLMs don't think because they predict the next token" is logically as wrong as "fleas can't jump because they have short legs".