CrulesAll 20 hours ago
"You can't prove humans do either." Yes you can via results and cross examination. Humans are cybernetic systems(the science not the sci-fi). But you are missing the point. LLMs are code written by engineers. Saying LLMs understand text is the same as saying a chair understands text. LLMs' 'understanding' is nothing more than the engineers synthesizing linguistics. When I ask an A'I' the Capital of Ireland, it answers Dublin. It does not 'understand' the question. It recognizes the grammar according to an algorithm, and matches it against a probabilistic model given to it by an engineer based on training data. There is no understanding in any philosophical nor scientific sense. | ||
lordnacho 18 hours ago | parent
> When I ask an A'I' the Capital of Ireland, it answers Dublin. It does not 'understand' the question.

You can do this trick as well. Haven't you ever been in a class that you didn't really understand, but could still give correct answers in? I've had this somewhat unsettling experience several times: someone asks you a question, words come out of your mouth, the other person accepts your answer. But you don't know why.

Here's a question you probably know the answer to, but don't know why:

- I'm having steak. What type of red wine should I have?

I don't know shit about Malbec. I don't know where it's from, why it's good with steak, who makes it, or how it's made. But if I'm sitting at a restaurant and someone asks me about wine, I know the answer.