napoleond | 5 hours ago
> It doesn't know if 1 person made all those requests, or N.

FWIW this is highly unlikely to be true. It's true that the upstream provider won't know it's _you_ per se, but most LLM providers strongly encourage proxies like OpenRouter to distinguish between downstream clients for security and performance reasons. For example:

- https://developers.openai.com/api/docs/guides/safety-best-pr...
- https://developers.openai.com/api/docs/guides/prompt-caching...
BeetleB | 5 hours ago | parent
Fair point. Would be good to hear from OpenRouter folks on how they handle the safety identifier. For prompt caching, they already say they permit it and do not consider it "logging" (i.e., even with zero data retention turned on, requests will still go to providers that do prompt caching).
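For what it's worth, the OpenAI safety guide referenced above recommends tagging requests with a stable per-end-user identifier rather than raw PII. A minimal sketch of how a proxy or app might do that (the model name is a placeholder, and hashing the user ID is just one common way to avoid sending PII upstream):

```python
import hashlib


def build_request(prompt: str, end_user_id: str) -> dict:
    """Build a chat-completions payload tagged with a per-user identifier.

    The raw user ID is hashed so the upstream provider can distinguish
    downstream clients without receiving PII.
    """
    hashed = hashlib.sha256(end_user_id.encode()).hexdigest()
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "safety_identifier": hashed,  # per OpenAI's safety best practices
    }


payload = build_request("Hello", "user-123")
```

Whether OpenRouter populates that field per downstream client (vs. one identifier for all its traffic) is exactly the open question here.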