simianwords 2 days ago
> And I do not agree. LLMs are literally incapable of understanding the concept of truth, right/wrong, knowledge and not-knowledge. It seems pretty crucial to be able to tell if you know something or not for any level of human-level intelligence.

How are you so sure about this?

> If one believes LLMs are capable of cognition, honestly asking: what formal proof is there for our own cognition?