DangitBobby a day ago
Or the original point doesn't actually hold up to basic scrutiny and is indistinguishable from straw itself.
jacquesm a day ago
HN has guidelines for a reason.
tovej 18 hours ago
The original point, that LLMs are plagiarising their inputs, is a common and commonsense opinion. There are court cases addressing exactly this question right now, and if you consider how LLMs operate, a reasonable person can see that it looks an awful lot like plagiarism. Claiming it is not plagiarism requires a good argument, because it is unclear that LLMs can produce novelty: they are literally trained to reproduce their input data as faithfully as possible.