Havoc 4 hours ago
Does anyone know whether the cache is segregated by user/API key for the big providers? I was looking at modifying outgoing requests via a proxy and wondering whether that's harming caching. Common coding tools presumably have a shared prompt across all their installs, so a universal cache would save a lot.
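(From what I understand, caching with the big providers is prefix-based, matching on the exact leading tokens of the request, so a proxy only hurts cache hits if it rewrites the front of the prompt. A rough sketch of the difference, with made-up message contents:)

    # The shared system prompt every install of the tool sends, unchanged.
    shared_system = "You are a coding assistant..."

    # Cache-friendly: the long shared prefix stays byte-for-byte identical;
    # only the tail of the request differs per user.
    cache_friendly = [
        {"role": "system", "content": shared_system},
        {"role": "user", "content": "refactor this function..."},
    ]

    # Cache-hostile: a proxy prepending per-request data changes the prefix,
    # so every request looks new to the cache.
    cache_hostile = [
        {"role": "system", "content": "[request 12345]\n" + shared_system},
        {"role": "user", "content": "refactor this function..."},
    ]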
moebrowne 4 hours ago | parent
For ChatGPT:

> Prompt caches are not shared between organizations. Only members of the same organization can access caches of identical prompts.

https://platform.openai.com/docs/guides/prompt-caching#frequ...
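The same docs expose a cached_tokens counter in the response usage block, so you can check whether requests going through your proxy still hit the cache. A minimal sketch, assuming the official openai Python client and a prompt long enough (1024+ tokens) to be cacheable:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Pad the shared prefix past the 1024-token minimum for caching.
    long_shared_prompt = "You are a coding assistant. " * 300

    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": long_shared_prompt},
            {"role": "user", "content": "hello"},
        ],
    )

    # Zero on a cold cache; non-zero on later requests sharing the same prefix.
    print(resp.usage.prompt_tokens_details.cached_tokens)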
samwho 4 hours ago | parent
I was wondering about this when I was reading around the topic. I can’t personally think of a reason you would need to segregate, though it wouldn’t surprise me if they do it for some sort of compliance reason. I’m not sure though; I’d love to hear something first-party.
| ||||||||||||||||||||||||||||||||||||||||||||