simianwords, 6 hours ago:
It’s the degree of generalisability. And LLMs do have understanding. You can ask an LLM, in natural language, how it came up with its process, and it can help you follow it - something a calculator can’t do.
bigfishrunning, 5 minutes ago:
> And LLMs do have understanding.

They absolutely do not. If you "ask it how it came up with the process in natural language", it will produce an output that follows from that input, driven by the statistics encoded in the model. That output may or may not be helpful, but it is likely to be stylistically plausible. An LLM does not think or understand; it is merely a statistical model (that's what the M stands for!).
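To make the "statistics encoded in the model" point concrete, here is a minimal sketch of the sampling loop at the core of every LLM. The network itself is replaced by a toy stub (`toy_logits`, a hypothetical stand-in; a real model computes these scores with billions of learned parameters), but the loop is the same: score the vocabulary given the context, normalize to a probability distribution, and draw the next token.

```python
import math
import random

# Toy stand-in for a trained network: maps a context to scores (logits)
# over a tiny vocabulary. The names and scores here are illustrative only.
VOCAB = ["I", "think", "therefore", "am", "."]

def toy_logits(context: list[str]) -> list[float]:
    # Hypothetical scores derived only from the context string; a real model
    # would derive them from statistics learned during training.
    rng = random.Random(" ".join(context))
    return [rng.uniform(-2.0, 2.0) for _ in VOCAB]

def softmax(logits: list[float]) -> list[float]:
    # Convert raw scores into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next(context: list[str]) -> str:
    # Each token is drawn from a distribution over the vocabulary,
    # conditioned on everything generated so far.
    probs = softmax(toy_logits(context))
    return random.choices(VOCAB, weights=probs, k=1)[0]

context = ["I"]
for _ in range(4):
    context.append(sample_next(context))
print(" ".join(context))  # fluent-looking output, no comprehension required
```

Nothing in this loop inspects meaning; "asking it how it came up with the process" just extends the context and triggers more sampling from the same distribution.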