scosman 7 hours ago

Yeah I mention that in the question.

Might still be valid for closed source projects (probably is).

I think courts would need to weigh in on the open source side. There’s legal precedent that you can use a derived work to generate a new, unique work (the spec derived from the copyrighted code is very much a derived work). There are also rulings that LLMs are transformative works, not just copies of their training data.

LLMs can’t reproduce their entire training set. But this thinking is also ripe for misuse. I could always train or fine-tune a model on the original work so that it can reproduce the original. We quickly get into statistical arguments here.

It’s a really interesting question.

swiftcoder 4 hours ago

> There’s legal precedent that you can use a derived work to generate a new, unique work (the spec derived from the copyrighted code is very much a derived work)

Indeed, but in the clean room scenario, the party who implements the spec has to be a separate entity that has never seen the code. Whether or not the LLM itself infringes copyright is a separate question; it definitely has (at least some) familiarity with the code in question, which makes the "clean room" argument an uphill battle.

jacquesm 5 hours ago

I just wrote a long comment about that, but yes, you are on to something here.

The key to me is that the LLM itself is a derived work, and that by definition it cannot produce something original. That in turn would make profiting from such a derived work, created by an automated process from copyrighted works, a case of wholesale copyright infringement. If you can get a judge to agree on that, I predict the price of RAM will come down again.