__mharrison__ 2 hours ago
I realize my comment was unclear. I use the Codex CLI all the time, but generally with this invocation: `codex --full-auto -m gpt-5.2`. However, when I use the 5.2-codex model, I've found it to be very slow and worse (hard to quantify, but I preferred straight-up 5.2 output).