conartist6 6 days ago

In that universe, there is no accountability at all. There is no way the system of law would allow that.

I admit that in my message I was careful not to make an accusation. I only stated my disposition towards their actions.

michaelmior 6 days ago | parent [-]

IANAL, but I don't believe that a lack of intent to cause harm means relief can't be granted. You could still take legal action if they refused to remove the information. To my knowledge, a question that still hasn't been thoroughly tested from a legal perspective is to what degree users of LLMs should reasonably be expected to be aware of the potential for false information, and to what degree continued use despite that knowledge constitutes willful negligence.

Humans can make mistakes too when compiling information, and when those mistakes are made unintentionally, without intent to cause harm, liability is typically limited. I would expect the same to be true of LLM use. As long as the user of the LLM has taken reasonable precautions to ensure accuracy, I think liability probably should be limited in most cases. In the case of DeepWiki getting something wrong, I think the case for significant reputational damage is pretty weak.

conartist6 6 days ago | parent [-]

It's the "reasonable precautions" part where it seems to me they are likely to be what I would consider partially to wholly negligent. The lack of any ability to opt out, for example.

michaelmior 4 days ago | parent [-]

Agreed that there should be an ability to opt out.