Weves 8 hours ago

Thanks for the kind words!

The key point there is that many would do it through Azure / Bedrock, or locally host the open-source models. Also, all chats / indexed data live on-prem, and there are better retention guarantees when using the APIs directly.

bilekas 8 hours ago | parent | next [-]

Ah, I see. That makes a bit more sense and definitely adds a value multiplier for enterprises, I'd imagine! I'll try out the open-source one and see how it works out!

thinkloop 6 hours ago | parent | prev [-]

Is running your LLM through Azure insecure? I mean, more so than running anything on the cloud? My understanding was that Azure GPT instances are completely independent, with the same security protocols as databases, VMs, etc.

bilekas 4 hours ago | parent [-]

Azure wouldn't be, if you use your company's AD/OAuth. I'm GUESSING that running local models with data transfer might expose that communication if your local machine, or someone else's, is compromised; that's potentially multiple points of leakage, and companies generally like to limit that risk. This is all an assumption, btw.

Edit: grammar