axdsk 3 days ago
The polygraph is a good example. The name "lie detector" misleads people; the polygraph actually measures autonomic arousal. I think misnomers like this can cause real issues, such as thinking the LLM is "reasoning".
dexterlagan 3 days ago | parent
Agreed, but in the case of the lie detector it seems to be a matter of interpretation. In the case of LLMs, what is it? Is it a matter of saying "a next-word calculator that uses statistics, matrices and vectors to predict output" instead of "a reasoning simulation built on a neural network"? Is there a better name? I'd say it's "a static neural network that consumes textual input and outputs a stream of words, and that can simulate, with a high degree of accuracy, the internal monologue of a person thinking about and reasoning on that input". Whatever it is, it's not reasoning, but it's not a parrot either.
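For what the "next-word calculator" framing means mechanically, here's a minimal sketch in Python. The vocabulary and scores are made up for illustration; a real model produces logits over ~100k tokens from billions of learned weights, but the final step is the same: turn scores into probabilities and pick (or sample) the next token.

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution.
    # Subtracting the max is a standard numerical-stability trick.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_next(vocab, logits):
    # Greedy decoding: return the highest-probability candidate.
    probs = softmax(logits)
    return max(zip(vocab, probs), key=lambda pair: pair[1])

# Hypothetical candidates and scores for the prompt "The cat sat on the"
vocab = ["mat", "moon", "idea"]
logits = [2.1, 0.3, -1.0]

word, prob = predict_next(vocab, logits)
print(word)  # highest-scoring candidate wins
```

No "thinking" happens anywhere in this loop; the apparent reasoning comes from the statistical structure baked into the weights that produce the logits.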