zozbot234 2 hours ago
This is satire, but the very notion of open source license obligations is meaningless in this context. FLOSS licenses don't require you to publish purely internal changes to the code; any publication happens by your choice. And given that AI can now supposedly engineer a clean-room reimplementation of any published program whatsoever, publishing your software under a proprietary copyright isn't exactly going to save you either.
eru 2 hours ago
No, no: some open source licenses do require you to publish internal changes. E.g., some are explicitly written so that you have to publish even when you 'only' run the changed code on your own servers. (Not having to publish in that case was seen as a loophole for cloud companies to exploit.)
utopiah 2 hours ago
"given that AI can now supposedly engineer a clean-room reimplementation of any published program whatsoever" I'm missing something there, that's precisely what I'm arguing again. How can it do a clean-room reimplementation when the open source code is most likely in the training data? That only works if you would train on everything BUT the implementation you want. It's definitely feasible but wouldn't that be prohibitively expensive for most, if not all, projects? | |||||||||||||||||||||||
nearlyepic 2 hours ago
Am I right in thinking that this is not even "clean room" in the sense people usually mean, e.g. Compaq? The "clean room" aspect there came from the fact that the people writing the new implementation had no knowledge of the original source material; they were just given a specification to implement (see also Oracle v. Google). If you're feeding an LLM GPL'd code and it "creates" something "new" from it, that's not "clean room", right? At the end of the day, the supposed reimplementation the LLM generates isn't copyrightable either, so maybe this is all moot.