BeetleB | 5 hours ago
The two things I like about OpenRouter:

1. The LLM provider doesn't know it's you (unless you have personally identifiable information in your queries). If N people are accessing GPT-5.x through OpenRouter, OpenAI can't distinguish them; it doesn't know whether 1 person made all those requests or N.

2. The ability to ensure your traffic is routed only to providers that claim not to log your inputs (not even for security purposes): https://openrouter.ai/docs/guides/routing/provider-selection...

It's been forever since I played with LiteLLM. Can I get these with it?
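For point 2, the restriction can be expressed directly in the request payload. A minimal sketch, assuming OpenRouter's documented provider-routing preferences (the exact field names, `provider`, `data_collection`, and `allow_fallbacks`, should be verified against the current API docs):

```python
import json


def build_request(model: str, messages: list[dict]) -> dict:
    """Build an OpenRouter chat-completions payload that asks the router
    to use only providers that claim not to retain prompts.

    NOTE: the 'provider' preference fields below follow OpenRouter's
    provider-routing documentation; treat them as assumptions and check
    the current docs before relying on them.
    """
    return {
        "model": model,
        "messages": messages,
        "provider": {
            # "deny" restricts routing to providers that do not store inputs
            "data_collection": "deny",
            # fail the request rather than fall back to an out-of-policy provider
            "allow_fallbacks": False,
        },
    }


payload = build_request("openai/gpt-5", [{"role": "user", "content": "hi"}])
print(json.dumps(payload, indent=2))
```

The payload is then POSTed to the usual chat-completions endpoint with your OpenRouter API key; nothing else about the request changes.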
napoleond | 5 hours ago
> It doesn't know if 1 person made all those requests, or N.

FWIW, this is highly unlikely to be true. The upstream provider won't know it's _you_ per se, but most LLM providers strongly encourage proxies like OpenRouter to distinguish between downstream clients for security and performance reasons. For example:

- https://developers.openai.com/api/docs/guides/safety-best-pr...
- https://developers.openai.com/api/docs/guides/prompt-caching...
instalabsai | 2 hours ago
One additional major benefit of OpenRouter is that there is no rate limiting. That was the primary reason we went with it: the rate limits on the native providers are tight.
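When you do hit a native provider's limits, the usual workaround is retrying with exponential backoff and jitter. A minimal sketch; `RuntimeError` is a stand-in for whatever rate-limit exception (HTTP 429) the provider's SDK actually raises:

```python
import random
import time


def with_backoff(call, max_retries: int = 5, base: float = 1.0):
    """Retry a callable on rate-limit errors with exponential backoff.

    'call' is any zero-argument function that raises RuntimeError (our
    stand-in for a provider's 429 error) when rate limited. Delays grow as
    base * 2**attempt, plus jitter, capped at 30 seconds.
    """
    for attempt in range(max_retries - 1):
        try:
            return call()
        except RuntimeError:
            # jittered exponential backoff before the next attempt
            time.sleep(min(base * 2 ** attempt + random.random() * base, 30))
    return call()  # final attempt; let any error propagate to the caller
```

This doesn't raise your quota, of course; it just smooths bursts, which is why a router that spreads load across providers can be the simpler fix.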