otabdeveloper4 3 hours ago
When OP wrote about LLMs "thinking," he implied that they have an internal, self-reflecting conceptual state. They don't; they *are* merely next-token-predicting statistical machines.
rafram 2 hours ago | parent
This was true in 2023.