Chance-Device 9 hours ago
This is a weird article. How many times in your career have you been handed a grossly under-specified feature and had to muddle your way through, asking relevant people along the way and still being told at the end that it's wrong? This is exactly the same thing, but for AIs. The user might think that the AI got it wrong, except the spec was under-specified and it had to make choices to fill in the gaps, just like a human would.

It's all well and good if you don't actually know what you want and you're using the AI to explore possibilities, but if you already have a firm idea of what you want, just tell it in detail.

Maybe the article is actually about bad specs? It does seem to venture into that territory, but that isn't the main thrust. Overall I think this is just a part of the cottage industry that's sprung up around agile, and an argument for that industry to stay relevant in the age of AI coding, without being well supported by anything.
LunicLynx 9 hours ago | parent
I sometimes wonder how many comments here are driving a pro-AI narrative. This very much seems like one of those. The agent here is: look on HN for AI-skeptical posts, then write a comment that highlights how the human got it wrong, and command your other AI agents to upvote that reply.