gruez 5 hours ago:
> but if not, varying other models may be fine too, depending on use case and be massively cheaper

Do inference providers have standardized endpoints, or at least endpoints compatible with Claude Code? Otherwise, why pay 5.5% on all your tokens just so it's slightly easier to swap providers (i.e. changing a few URLs)?
swiftcoder 5 hours ago:
> Do inference providers have standardized endpoints, or at least endpoints compatible with claude code?

Yep, you can plug DeepSeek/Kimi/MiniMax into Claude Code just fine. Or run everything through another harness like opencode instead.
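A rough sketch of what "changing a few URLs" looks like in practice: Claude Code reads its backend from environment variables, so pointing it at an Anthropic-compatible endpoint from another provider is a matter of overriding those before launch. The base URL and model name below are placeholders, not real provider values; check the provider's own docs for the actual endpoint and supported models.

```shell
# Point Claude Code at an alternative Anthropic-compatible backend.
# NOTE: the URL and model name here are illustrative placeholders.
export ANTHROPIC_BASE_URL="https://api.example-provider.com/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-provider-api-key"
export ANTHROPIC_MODEL="example-model-name"

# Launch Claude Code; it will send requests to the backend above.
claude
```

Switching back to Anthropic's own API is just unsetting those variables, which is why a flat percentage fee purely for provider portability is a hard sell.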