simonw 2 days ago

One of the most interesting coding agents to run locally is actually OpenAI Codex, since it has the ability to run against their gpt-oss models hosted by Ollama.

  codex --oss -m gpt-oss:20b
Or gpt-oss:120b if you can fit the larger model.
AlexCoventry 2 days ago | parent [-]

What do you find interesting about it, and how does it compare to commercial offerings?

simonw 2 days ago | parent [-]

It's rare to find a local model that's capable of running tools in a loop well enough to power a coding agent.

I don't think gpt-oss:20b is strong enough, to be honest, but 120b can do an OK job.

Nowhere NEAR as good as the big hosted models though.

ontouchstart 2 days ago | parent | next [-]

Think of it as the early years of UNIX and the PC. Running inference and tools locally and offline opens doors to new industries. We might not even need the client/server paradigm locally: an LLM is just a probabilistic library we can call.
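
The "library we can call" framing can be sketched concretely. This is a minimal, hedged example that assumes an Ollama server running on its default port (11434) and uses its documented `/api/generate` endpoint; the model name is just an example and `generate` is a hypothetical helper, not part of any official SDK:

  import json
  import urllib.request

  def build_request(model, prompt, host="http://localhost:11434"):
      """Build a non-streaming request for Ollama's /api/generate endpoint."""
      payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
      return urllib.request.Request(
          f"{host}/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )

  def generate(model, prompt):
      """Send the prompt to a locally running Ollama server, return the response text."""
      with urllib.request.urlopen(build_request(model, prompt)) as resp:
          return json.loads(resp.read())["response"]

  # Example (requires a running Ollama server with the model pulled):
  #   generate("gpt-oss:20b", "Write a haiku about pipes.")

From the caller's point of view it really is just a function call, no cloud account or network dependency beyond localhost.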

AlexCoventry 2 days ago | parent | prev [-]

Thanks.