crazygringo 21 hours ago
But this is going to be in every AI's training set. I just fed ChatGPT your exact prompt and it gave back exactly what I expected: "This is a classic riddle that challenges assumptions. The answer is: The surgeon is the boy’s mother. The riddle plays on the common stereotype that surgeons are male, which can lead people to overlook this straightforward explanation."
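For anyone who wants to reproduce this test programmatically rather than through the web UI, a minimal sketch using the OpenAI Python client might look like the code below. The model name is an assumption, and the prompt string is a placeholder since the exact riddle wording lives in the parent comment upthread.

    # Minimal sketch: send the riddle to a chat model and print its reply.
    # Assumes the official "openai" Python package (v1+) and an
    # OPENAI_API_KEY in the environment. The model name is an assumption,
    # not something stated in this thread.
    from openai import OpenAI

    client = OpenAI()

    # Placeholder: substitute the exact riddle prompt from the parent comment.
    prompt = "<exact riddle prompt from the parent comment>"

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; swap in whichever you're testing
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)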
hnuser123456 20 hours ago
The surgeon could be the cousin's mom or dad. The cousin's dad didn't die in the crash; his cousin did. The question "how is this possible?" implies there is some sort of contradiction when there isn't any at all. It has nothing to do with sexism, and to say it does reflects a bias in the reader, causing them to "spidey sense" a cultural bugaboo when it's utterly irrelevant in this scenario.
allemagne 21 hours ago
Yeah, this is the issue with the prompt: it also trips up humans who gloss over "cousin". I'm assuming that pointing this out leads you, the human, to reread the prompt, go "ah, ok", and adjust the way you're thinking about it. ChatGPT (and DeepSeek, at least) will usually just double and triple down, repeating "this challenges gender assumptions" over and over.
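To observe the doubling-down behavior described here, one could extend the single-turn sketch from the earlier comment with a follow-up turn. The wording of the correction message is a hypothetical example, not a quote from the thread.

    # Hypothetical follow-up turn: point out the "cousin" detail and check
    # whether the model revises its answer or repeats the gender-assumption
    # explanation. Reuses `client`, `prompt`, and `response` from the
    # single-turn sketch above.
    messages = [
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": response.choices[0].message.content},
        {"role": "user", "content": "Reread the riddle: it says cousin, "
                                    "not son. Does your answer still hold?"},
    ]
    followup = client.chat.completions.create(
        model="gpt-4o",  # assumed model, as above
        messages=messages,
    )
    print(followup.choices[0].message.content)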
abenga 21 hours ago
That is the exact wrong answer that all models give.