Weves 8 hours ago
Thanks for the kind words! The key point there is that many would do it through Azure / Bedrock, plus locally hosting the open-source models. Also, all chats / indexed data live on-prem, and there are stronger retention guarantees when using the APIs directly.
bilekas 8 hours ago
Ah, I see. That makes more sense and definitely adds a value multiplier for enterprises, I would imagine! I'll try out the open-source one and see how it works out!
thinkloop 6 hours ago
Is running your LLM through Azure insecure? I mean, more so than running anything else on the cloud? My understanding was that Azure GPT instances were completely independent, with the same security protocols as databases, VMs, etc.