Workaccount2 | a day ago
Or it's assuming you are asking about Marathon Valley, which is very reasonable given the context. Ask it about "Marathon Desert", which does not exist and isn't closely related to something that does, and it asks for clarification. I'm not here to say LLMs are oracles of knowledge, but I think the need to carefully craft specific "gotcha" questions in order to generate wrong answers is a pretty compelling case in the opposite direction. It's like the childhood joke of "What's up?"... "No, you dummy! The sky is!" Straightforward questions with straight wrong answers are far more interesting. I don't think many people ask LLMs trick questions all day.
krainboltgreene | 19 hours ago
If someone asked me or my kid "What do you know about Mt. Olampus?", we wouldn't reply "Oh, Mt. Olampus is a big mountain in Greek myth...". We'd say "Wait, did you mean Mt. Olympus?" It doesn't "assume" anything, because it can't assume; that's not how the machine works.