ssl-3 2 hours ago
The question is so outlandish that nobody would ever ask it of another human. But if someone did, they'd reasonably expect a response consisting 100% of snark. Yet the specificity required for a machine to deliver an apt, snark-free answer is -- somehow -- even more outlandish? I'm not sure I see it quite that way.
necovek an hour ago
Humans ask each other silly questions all the time: a human confronted with a question like this would either blurt out a bad response like "walk" without thinking, before realizing what they are suggesting, or pause and respond with "to get your car washed, you need to get it there, so you must drive".

Now, humans, besides sometimes answering without thinking at all (which is really similar to how basic LLMs work), can easily fall victim to context too: if your boss, who never pranks you like this, asked you to take his car to a car wash, and asked whether you'll walk or drive but told you to consider the environmental impact, you might get stumped and respond wrong too. (And if it's flat or downhill, you might even push the car for 50m ;))
shakna 2 hours ago
But the number of outlandish requests in business logic is countless. Like... In most accounting systems, once a record is end-dated and confirmed, it should cascade that end-date to its children and should not be able to repeat the process... Unless you have some data-cleaning validation bypass. Then you can repeat the process as much as you like. And maybe not cascade to children. There are more exceptions than there are rules, the moment you get any international pipeline involved.
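To make the kind of rule I mean concrete, here's a rough Python sketch (all names are invented, and real validation is far messier): end-dating is a one-shot operation that cascades to children, unless a hypothetical data-cleaning bypass flag is set, in which case you can repeat it and the cascade is skipped.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class Record:
        end_date: Optional[date] = None
        confirmed: bool = False
        children: List["Record"] = field(default_factory=list)

    def end_date_record(rec: Record, when: date, bypass_validation: bool = False) -> None:
        # Normal path: end-dating is a one-shot operation.
        if rec.confirmed and not bypass_validation:
            raise ValueError("record is already end-dated and confirmed")
        rec.end_date = when
        rec.confirmed = True
        if not bypass_validation:
            # The hypothetical bypass skips both the repeat check and the
            # cascade, so children are left untouched on that path.
            for child in rec.children:
                child.end_date = when
                child.confirmed = True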
coldtea 2 hours ago
>The question is so outlandish that it is something that nobody would ever ask another human

There is an endless variety of quizzes just like that which humans ask each other for fun, there is a whole lot of "trick questions" humans ask other humans to trip them up, and there are all kinds of seemingly normal questions with dumb assumptions, quite close to this one, that humans exchange.
jstummbillig 2 hours ago
I'd be entirely fine with a humorous response. The Gemini Flash answer that was posted somewhere in this thread is delightful.
Agentlien 2 hours ago
I've made a few facetious comments in ChatGPT conversations. It invariably misses them and takes my words at face value. Even when prompted that there's sarcasm it missed, it apologizes but is unable to figure out what it's missing. I don't know if it's a lack of intellect or the post-training crippling it with its helpful persona; I suspect a bit of both.