| ▲ | yanis_t 2 days ago |
| Already on OpenRouter. The Pro version is $1.74/M input tokens, $3.48/M output, while Flash is $0.14/M input, $0.28/M output. |
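For a rough sense of what those rates mean per request, here is a small sketch; the token counts are made-up examples, not real usage figures:

```python
# Per-request cost at the quoted OpenRouter rates (USD per 1M tokens).
# The 10k-in / 2k-out request below is an illustrative example.

RATES = {
    "deepseek-v4-pro":   {"input": 1.74, "output": 3.48},
    "deepseek-v4-flash": {"input": 0.14, "output": 0.28},
}

def cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    r = RATES[model]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000

pro = cost_usd("deepseek-v4-pro", 10_000, 2_000)      # -> 0.02436
flash = cost_usd("deepseek-v4-flash", 10_000, 2_000)  # -> 0.00196
```

So at these prices Pro is roughly 12x the cost of Flash for the same traffic.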
|
| ▲ | esafak 2 days ago | parent | next [-] |
| https://openrouter.ai/deepseek/deepseek-v4-pro https://openrouter.ai/deepseek/deepseek-v4-flash |
| |
| ▲ | 77ko 2 days ago | parent [-] | | It's on OR, but currently not available on their Anthropic endpoint. OR, if you read this, please enable it there! I'm using kimi-2.6 with Claude Code and it works well, but DeepSeek V4 gives an error: when Claude Code calls `https://openrouter.ai/api/messages` with `model=deepseek/deepseek-v4-pro`, OR returns an error because their Anthropic-compat translator doesn't cover V4 yet. The Claude CLI dutifully surfaces that error as "model...does not exist". |
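For context, a minimal sketch of the kind of request Claude Code would be sending to that endpoint. The URL is the one from the comment above; the payload shape follows Anthropic's Messages API, and the request is only constructed here, not sent:

```python
import json

# OpenRouter's Anthropic-compatible endpoint, per the comment above.
OR_ANTHROPIC_URL = "https://openrouter.ai/api/messages"

def build_messages_request(model: str, prompt: str, max_tokens: int = 1024) -> dict:
    """Build an Anthropic Messages-API-style request body."""
    return {
        "model": model,  # e.g. "deepseek/deepseek-v4-pro" (currently rejected by the translator)
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_messages_request("deepseek/deepseek-v4-pro", "hello")
body = json.dumps(payload)  # this is what gets POSTed with the usual auth headers
```

The translation layer has to map this body onto OpenRouter's native chat format, which is presumably the part that doesn't know about the V4 slugs yet.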
|
|
| ▲ | nl 2 days ago | parent | prev | next [-] |
| The Pro model is giving 429 Overload errors |
| |
|
| ▲ | astrod 2 days ago | parent | prev [-] |
| Getting 'Api Error' here :(
Every other model is working fine. |
| |
| ▲ | poglet 2 days ago | parent [-] | | Try interacting with it through the website; it will show the error along with some explanation of the issue. I had to relax my guardrail settings. |
|