acidburnNSA 7 hours ago
* "Self-hosted: Runs entirely on your infrastructure. No data leaves your network." * "Bring Your Own LLM: Anthropic, OpenAI, Gemini, or open-weight models via vLLM." With so many newbies wanting these kinds of services, it might be worth adjusting the first bullet to say: "No data leaves your network, at least as long as you don't use any Anthropic, OpenAI, or Gemini models via the network, of course"
prvnsmpth 7 hours ago | parent
That's a good point, it might make sense to clarify that for individuals who want to self-host. I'll make the change, thanks!
cjonas 7 hours ago | parent
Most organizations are going to be self-hosting on AWS, GCP, or Azure... So as long as you use their inference services as your LLM, you can keep it all within the private network
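For the fully-local case the thread alludes to, a minimal sketch of what "no data leaves your network" looks like in practice: vLLM serves an OpenAI-compatible API, so a client just targets a host inside the private network instead of a vendor endpoint. The URL, model name, and prompt below are illustrative assumptions, not values from the project being discussed.

```python
# Hypothetical sketch: build a chat-completion request against a
# self-hosted vLLM server's OpenAI-compatible endpoint, so prompts
# never traverse the public internet. Endpoint and model are assumptions.
import json
import urllib.request

VLLM_BASE = "http://localhost:8000/v1"  # vLLM's OpenAI-compatible server

payload = {
    # Any open-weight model the local vLLM instance is serving.
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Summarize this internal doc."}],
}

req = urllib.request.Request(
    f"{VLLM_BASE}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        # vLLM does not require a real API key unless configured to.
        "Authorization": "Bearer EMPTY",
    },
)

# urllib.request.urlopen(req)  # left commented: requires a running vLLM server
```

Pointing the same client at a cloud provider's in-VPC inference endpoint (as the parent comment suggests) works identically; only the base URL changes.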