petesergeant 3 days ago
With all these things, it depends on your own eval suite. gpt-oss-120b works as well as o4-mini over my evals, which means I can run it via OpenRouter on Cerebras where it's SO DAMN FAST and like 1/5th the price of o4-mini.
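In case it helps, a rough sketch of what that setup looks like: OpenRouter exposes an OpenAI-compatible endpoint, and you can pin the request to the Cerebras provider via provider routing. The model slug and the provider-routing fields below are assumptions based on OpenRouter's docs, so verify them before relying on this.

    # Minimal sketch: gpt-oss-120b via OpenRouter, pinned to the Cerebras provider.
    # Assumptions: the "openai/gpt-oss-120b" slug and the provider-routing fields
    # ("order", "allow_fallbacks") come from OpenRouter's docs; double-check them
    # against the current API before using.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key="YOUR_OPENROUTER_API_KEY",  # placeholder key
    )

    resp = client.chat.completions.create(
        model="openai/gpt-oss-120b",
        messages=[{"role": "user", "content": "Explain this function in one sentence."}],
        # OpenRouter-specific provider routing is passed in the request body.
        extra_body={"provider": {"order": ["Cerebras"], "allow_fallbacks": False}},
    )
    print(resp.choices[0].message.content)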
indigodaddy 3 days ago | parent
How would you compare gpt-oss-120b to the following (for coding)?

- Qwen3-Coder-480B-A35B-Instruct
- GLM-4.5 Air
- Kimi K2
- DeepSeek V3 0324 / R1 0528
- GPT-5 Mini

Thanks for any feedback!