| ▲ | prism56 7 hours ago |
| If the data is open source on GitHub, then in my opinion it should be fair game. |
|
| ▲ | ozgrakkurt 6 hours ago | parent | next [-] |
| IMO this is unfair for GPL or similarly licensed code. Seems ok for MIT-like licensed code though. |
| |
| ▲ | ForHackernews 5 hours ago | parent | next [-] | | It's totally fair to use GPL code; it just means all the models built by Anthropic, OpenAI, etc. using GPL-licensed source are themselves bound by the GPL. Plus, any works created downstream using those AI tools. We're on the verge of a golden age of software as soon as someone finds a court with courage. |
| ▲ | duskdozer 5 hours ago | parent [-] | | Ah, you have much more faith in the legal system than I do. It's nice to dream, though. |
| |
| ▲ | edg5000 4 hours ago | parent | prev [-] | | I think AI will create an open source dark age. Gradually, we'll see a lot less new good open source code. A gradual shift back to the proprietary world. Similar to the 1950-1990 period. |
|
| ▲ | notrealyme123 5 hours ago | parent | prev [-] |
| Things being public should not be enough. Just because someone leaked your medical information to the public via a data breach should not make it fair game. There should be some rules. |
| |
| ▲ | prism56 5 hours ago | parent [-] | | I feel that's a false dichotomy. The code on GitHub is freely available for people to read and learn from; leaked medical data isn't. |
|