tecoholic 7 hours ago

I use two CLIs, Codex and Amp. Almost every time I need a quick change, Amp finishes the task in the time it takes Codex to build context. I think it has a lot to do with the system prompt and the "read loop" as well: Amp will read multiple files in one go and get to the task, whereas Codex crawls the files almost one by one. Has anyone else noticed this?
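A toy sketch of why that matters (purely illustrative, not either tool's actual internals; `Turn` and `run_loop` are made-up names): each model round trip is a full request/response cycle, so an agent that batches all its read tool calls into one turn pays that latency once instead of once per file.

    from dataclasses import dataclass

    @dataclass
    class Turn:
        # File paths the model asks the harness to read in this one response.
        tool_calls: list[str]

    def run_loop(turns: list[Turn]) -> int:
        # Execute the reads turn by turn; return how many model round trips it took.
        context: dict[str, str] = {}
        for turn in turns:
            for path in turn.tool_calls:
                context[path] = f"<contents of {path}>"  # stand-in for a real file read
        return len(turns)

    files = ["api.py", "models.py", "tests/test_api.py"]

    # Batched style: one model turn requests every relevant file at once.
    batched = [Turn(tool_calls=files)]

    # Crawl style: a separate model round trip per file.
    crawl = [Turn(tool_calls=[f]) for f in files]

    print(run_loop(batched), "round trip(s) with batched reads")      # -> 1
    print(run_loop(crawl), "round trip(s) reading files one by one")  # -> 3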

nl 3 hours ago

Amp uses Gemini 3 Flash to explore code first. That model is a great speed/intelligence trade-off, especially for that use case.

sumedh 7 hours ago

Which GPT model and reasoning level did you use in Codex and Amp?

Generally, I have noticed GPT-5.2 Codex is slower than Sonnet 4.5 in Claude Code.

nl 3 hours ago

Amp doesn't have a conventional model selector; you choose fast vs. smart (I think that's what it's called).

In smart mode it explores with Gemini Flash and writes with Opus.

Opus is roughly the same speed as Codex, depending on thinking settings.
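For anyone curious what that split looks like, here is a minimal sketch of the pattern (my guess at the shape, not Amp's actual implementation; the model names and the `complete` helper are placeholders): route read-only exploration steps to the fast model and hand the edit/answer steps to the stronger one.

    # Hypothetical router for the explore-fast / write-smart split described above.
    # Model names and the `complete` helper are placeholders, not Amp's real API.

    EXPLORE_MODEL = "gemini-flash"   # fast and cheap: listings, file reads, search
    WRITE_MODEL = "opus"             # slower and stronger: diffs and final answers

    def complete(model: str, prompt: str) -> str:
        # Stand-in for whatever provider call the harness actually makes.
        return f"[{model}] {prompt[:40]}..."

    def handle_step(step_kind: str, prompt: str) -> str:
        model = EXPLORE_MODEL if step_kind == "explore" else WRITE_MODEL
        return complete(model, prompt)

    print(handle_step("explore", "List the files that mention UserSession"))
    print(handle_step("write", "Apply the session-timeout fix to session.py"))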

anukin 4 hours ago

What is your general flow with Amp? I plan to try it out myself and have been on the fence for a while.