| ▲ | christoff12 5 hours ago |
Asking a yes/no question implies the ability to handle either choice.
| ▲ | not_kurt_godel 5 hours ago | parent | next [-] |
This is a perfect example of why I'm not in any rush to do things agentically. Double-checking LLM-generated code is fraught enough one step at a time, but it's usually close enough that it can be course-corrected with light supervision. That calculus changes entirely when the automated version of the supervision fails catastrophically a non-trivial percent of the time.
| ▲ | Joker_vD 5 hours ago | parent | prev | next [-] |
Not when you're talking with humans, not really. Which is one of the reasons I got into computing in the first place, dangit!
| ▲ | efitz 5 hours ago | parent | prev | next [-] |
To an LLM, answering “no” and changing the mode of the chat window are discrete events that are not necessarily related. Many coding agents interpret mode changes as expressions of intent; Cline, for example, does not even ask — the only approval workflow is switching from plan mode to act mode. So while this is definitely both humorous and annoying, and potentially hazardous depending on your workflow, I don’t completely blame the agent, because from its point of view the user gave it mixed signals.
| ▲ | Lerc 5 hours ago | parent | prev | next [-] |
But I think if you sit down and really consider the implications of it and what yes or no actually means in reality, or even an overabundance of caution causing extraneous information to confuse the issue enough that you don't realise that this sentence is completely irrelevant to the problem at hand and could be inserted by a third party, yet the AI is the only one to see it. I agree.
| ▲ | wongarsu 5 hours ago | parent | prev [-] |
It's meant as a "yes" / "instead, do ..." question. When it presents the multiple-choice UI at that point, it should be the version where you either confirm (with/without auto-edit, with/without context clear) or give feedback on the plan. Just telling it "no" doesn't give the model anything actionable to do.
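The response handling described above can be sketched as follows. This is a hypothetical illustration, not Cline's or any agent's actual implementation; the function name and response categories are assumptions made for the example:

```python
# Hypothetical sketch of a plan-approval prompt that treats anything other
# than an explicit "yes" as feedback, and asks for something actionable
# when given a bare "no". Not any real agent's implementation.

def handle_plan_response(response: str) -> tuple[str, str]:
    """Classify a user's reply to "Proceed with this plan?".

    Returns (action, detail) where action is one of:
      "execute" - user approved the plan
      "ask"     - bare rejection with nothing actionable; ask for direction
      "revise"  - freeform "instead, do ..." feedback to fold into the plan
    """
    text = response.strip().lower()
    if text in {"yes", "y", "ok", "proceed"}:
        return ("execute", "")
    if text in {"no", "n", "stop"}:
        # A bare "no" carries no instruction, so prompt for an alternative.
        return ("ask", "What should be done instead?")
    # Any other reply is treated as feedback on the plan itself.
    return ("revise", response.strip())
```

The point of the third branch is the one made above: freeform text gives the model something actionable, whereas a bare "no" does not.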