beardbound | 2 hours ago
I mostly agree with you, but if a human straight up copies work under copyright, they're violating the law. Seems like an LLM should be held to the same standard, unless LLMs should be even less beholden to the law than people. It's also incredibly hard to tell whether an LLM copied something, since you can't ask it in court and it probably couldn't even tell you if it did.
ranger_danger | 2 hours ago | parent
From what I have seen, US courts seem to draw a distinction between output that is 100% machine-automated with no manual prompting at all and output where a human gave specific instructions about what to generate. (And yes, I realize everything a computer does requires prior instruction of some kind.)

But I think the copyright issue arises from the distribution of a work (one that is potentially derivative or transformative in the legal sense), which is typically done by a human to some extent. So I think that person would be on the hook for any potential violations, possibly even if they cannot produce sources themselves because the work was LLM-generated.

The legal test always seems to come back to what I said before: "how much was copied, and how obvious is it?" And that is going to be up to the subjective interpretation of the judge in each case.