alienbaby 2 hours ago

I'm curious in what kinds of situations you're seeing the model consistently do the opposite of your intention when the instructions were not complex. Do you have any examples?
avereveard 2 hours ago
Mostly Gemini 3 Pro. When I ask it to investigate a bug and provide fixing options (I do this mostly so I can see whether the model loaded the right context for large tasks), Gemini immediately starts fixing things, and I just can't trust it. Codex and Claude give a nice report, and if I see they're not considering this or that, I can tell them.