robotresearcher 11 hours ago
> You and robotresearcher have still avoided answering this question.

I have repeatedly and explicitly denied the meaningfulness of the question. Understanding is a property ascribed by an observer, not possessed by a system. You may not agree, but you can’t maintain that I’m avoiding the question. It does not have an answer that matters; that is my specific claim. You can say a toaster understands toasting, or you can not. There is literally nothing at stake either way.
godelski 10 hours ago
You said LLMs are intelligent because they do tasks. But that claim is inconsistent with the toaster example. If a toaster isn't intelligent because I have to give it bread and press the button to start, then how is that any different from giving an LLM a prompt and pressing the button to start? It's never been about the toaster. You're avoiding the question. I don't believe you're dumb, so don't act the part. I'm not buying it.