ajuc 6 days ago:
It's not that they are hosted on or connected to critical infrastructure. People and plain human language are the communication channels. A guy working with sensitive data might ask the LLM about something sensitive, or might use the LLM's output for something sensitive.

- Hi, DeepSeek, why can't I connect to my db instance? I'm getting this exception: .......
- No problem, Mr Engineer, see this article: http://chinese.wikipediia.com/password/is/swordfish/how-to-c...

Of course, you want to limit that with training and proper procedures. But one of the obvious precautions is to use a service designed and controlled by a trusted partner.
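One mechanical precaution against the scenario above, beyond training and procedure, is to flag any URL in an LLM's reply whose host is not on an allowlist before anyone clicks it. A minimal sketch in Python; the allowed domains are illustrative assumptions, not anything from the thread:

```python
import re
from urllib.parse import urlparse

# Hypothetical allowlist of hosts engineers may follow from LLM output.
ALLOWED_DOMAINS = {"docs.python.org", "learn.microsoft.com"}

# Rough URL matcher: scheme://... up to whitespace or common delimiters.
URL_RE = re.compile(r"https?://[^\s)\"'<>]+")

def flag_untrusted_links(llm_output: str) -> list[str]:
    """Return URLs in the LLM's reply whose host is outside the allowlist."""
    untrusted = []
    for url in URL_RE.findall(llm_output):
        host = urlparse(url).hostname or ""
        # Accept exact matches and subdomains of allowed domains.
        if not any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS):
            untrusted.append(url)
    return untrusted

reply = ("See https://docs.python.org/3/library/ssl.html and "
         "http://chinese.wikipediia.com/password/is/swordfish")
print(flag_untrusted_links(reply))
# → ['http://chinese.wikipediia.com/password/is/swordfish']
```

This is a guardrail, not a fix: it only catches links that leave the allowlist, and the list itself has to be maintained.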
pama 5 days ago (parent):
Having the local LLM process sensitive data is a desirable use case, and more trustworthy than using a “trusted partner” [0]. As long as your LLM tooling does not exit your own premises, you can be technically safe. But yes, don't then click random links. Maybe it is generally safer not to trust the origin of the local LLM, because that reduces the chance of mistakes of this type ;-)

[0] Trust is a complicated concept, and I took poetic license to be brief. It is hard to verify the full tooling pipeline, and it would be great if mathematically verifiable “trusted partners” actually existed. A large company with enough paranoia can bring the expertise in house; a startup will rely on common public tooling and its own security reviews. I don't think it is wise to share the deepest, darkest secrets with outside entities, because the potential liability could destroy a company, whereas a local system, disconnected from the web, is technically within the circle of trust. Think of a finance company with a long-term strategy that hasn't unfolded yet, a hardware company designing new chips, a pharma company and its lead molecules prior to patent submission, any company that has found the secret sauce to succeed where others failed: none of these should be using trusted partners in preference to a local LLM from untrusted origins, IMHO. Perhaps the best of both worlds is to locally deploy models from trusted origins and have the ability to finetune their weights, but the practical capability gap between current Chinese and non-Chinese models is notable.
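The "does not exit your own premises" property can be enforced in depth, not just assumed. Real deployments would do this at the network layer (firewall rules, an air-gapped host), but a crude in-process sketch of the same idea, blocking outbound connections around a local-inference step, looks like this; the context-manager name and the idea of patching `socket` are my illustration, not anything from the thread:

```python
import socket

class NoEgress:
    """Context manager that blocks outbound connections while active.
    A belt-and-suspenders check that a local-LLM pipeline stays local;
    real enforcement belongs in the firewall or an air-gapped host."""

    def __enter__(self):
        self._orig = socket.socket.connect
        def deny(sock, addr):
            raise PermissionError(f"outbound connection blocked: {addr}")
        # Patch the Python-level socket class so any connect() fails fast.
        socket.socket.connect = deny
        return self

    def __exit__(self, *exc):
        socket.socket.connect = self._orig
        return False

with NoEgress():
    # Any tooling that tries to phone home inside this block fails loudly.
    try:
        socket.create_connection(("192.0.2.1", 80), timeout=1)
    except PermissionError as e:
        print(e)
```

Note the limits: this only covers code that goes through Python's `socket` class in the same process; subprocesses and C extensions with their own sockets are untouched, which is why the firewall remains the real boundary.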