InsideOutSanta 20 hours ago
You're literally telling me that the thing that has happened on my computer in front of my own eyes has not happened. | |||||||||||||||||
throw310822 20 hours ago
If you mean "once in a thousand times an LLM will do something absolutely stupid," then I agree, but the exact same applies to human beings. In general, LLMs show excellent understanding of context and actual intent; they're completely different from our stereotype of blind algorithmic intelligence. Btw, were you using Codex by any chance? There was a discussion a few days ago where people reported that it follows instructions in an extremely literal fashion, sometimes with absurd outcomes such as the one you describe.