mfabbri77 3 hours ago
This has the potential to kill open source, or at least the most restrictive licenses (GPL, AGPL, ...): if a license no longer protects software from unwanted use, the only possible strategy is to make the development closed source.
abrookewood 18 minutes ago
It's not just open source, it's literally anything source-available, whether it was made available intentionally or not.
_dwt 3 hours ago
Yes, this is the reason I've completely stopped releasing any open-source projects. I'm discovering that newer models are somewhat capable of reverse-engineering even compiled artifacts like WebAssembly, so I can feel a sort of "dark forest theory" taking hold. Why publish anything - open or closed - just to be ripped off at negligible marginal cost?
| ||||||||||||||||||||
user34283 15 minutes ago
I find the wording "protect from unwanted use" interesting. My understanding is that what the GPL requires is releasing the source code of modifications. So if we assume that a rewrite using AI retains the GPL license, it only means the rewrite needs to be open source under the GPL too. It doesn't prevent any unwanted use as such; I guess "unwanted use" in this case could mean not releasing the modifications.