Aurornis 3 hours ago
I'm referring to hosted models, such as those available via OpenRouter or the model providers' own services. I think everyone claiming that inference is getting more expensive is unaware that there are more LLM providers than Google, Anthropic, and OpenAI.