bluefirebrand 3 days ago

> I've been looking for a copy-left "source available" license that allows me to distribute code openly but has a clause that says "if you would like to use these sources to train an LLM, please contact me and we'll work something out". I haven't yet found that

Frankly, do you think AI companies have even the remotest amount of respect for these licenses anyway? They will simply take your code if it's publicly scrapeable and train their models on it, exactly as they have so far. Then it will be up to you to chase them down and try to sue or whatever. And good luck proving the license violation

I dunno. I just don't really believe that many tech companies these days are behaving even remotely ethically. I don't have much hope that will change anytime soon

apatheticonion 3 days ago | parent | next [-]

I wonder if there is a "loaded" lawsuit to be had here that would be a win-win for establishing case law on license enforcement and LLMs.

Take a litigious company like Nintendo. If one were to train an LLM on their works and the LLM produced an emulator, that would force a lawsuit.

If Nintendo wins, then LLMs are stealing. If Nintendo loses, then we can decompile everything.

bluefirebrand 3 days ago | parent [-]

You're forgetting the option where the LLM companies pay Nintendo a silly amount of money for permission and Nintendo's executives take that as a win

archagon 3 days ago | parent | prev [-]

Traditionally, large corporations have taken very conservative legal stances with regard to integrating e.g. A/GPL code, even when there's almost no risk.

If my license explicitly says "any LLM output trained on this code is legally tainted," I feel like BigAICorp would be foolish to ignore it. Maybe I couldn't sue them today, but are they confident this will remain the case 5, 10, 20 years from now? Everywhere in the world?

63stack 2 days ago | parent [-]

GitHub has posted that they will now train on everyone's data (even private repos) unless you opt out (until they change their mind on that). Anthropic has already been training on your data on certain tiers. Meta bittorrented books to train their models.

Surely if your license says "LLM output trained on this code is legally tainted", it is going to dissuade them.

archagon 2 days ago | parent [-]

No, it won’t dissuade them. But when we finally get the chance to legally beat the shit out of these companies, I want to reserve my place in line.

Alternatively, they can learn to trust me on this and simply exclude/evict my code from the training corpus.