Workaccount2 | 2 days ago
They will just put a dumb copyright filter on the output, a la YouTube and other hosting services. Again, it's illegal for artists to reproduce copyrighted work, not to see it or know it. It's not like you can't hire a guy because he can perfectly visualize Pikachu in his head. Conflating training on copyrighted material with distributing it is disingenuous, and thankfully the courts have so far recognized that.
DiabloD3 | 2 days ago | parent
YouTube et al.'s copyright detection is mostly nonfunctional. It can only match nearly identical input with very little leeway; even resizing to the wrong aspect ratio, or shifting the audio sampling rate too far, fucks up the detection.

It's illegal for artists to distribute recreated copyrighted work in a way that is not transformative. It isn't illegal to produce it and keep it to themselves. People also distribute models, they don't merely offer them as a service. However, if someone asks a model to produce a copyright violation and it does so, then the person who created and distributed the model (it's the distribution that is the problem), the service that ran it (assuming it isn't local inference), and the person who asked for the violation to be created can all be looped into the legal case.

This happened before the world of AI, too. Even companies that fully participated in the copyright regime, performed prompt takedowns, and ran copyright detection to the best of their ability were sued, and lost, because their users committed copyright violations using their services, even though the company did everything right and above board. The law is stacked against service providers on the Internet: it essentially requires them to be omniscient and omnipotent. Such requirements are not levied against service providers in other industries.
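To make the fragility concrete, here is a toy sketch contrasting an exact fingerprint (a cryptographic hash, which breaks on any change) with a simple perceptual "average hash" (which tolerates a one-pixel tweak but drifts once you letterbox the image, i.e. change the aspect ratio). This is purely illustrative: Content ID's actual matching algorithm is proprietary, and all names and parameters here are my own assumptions.

```python
import hashlib

def average_hash(pixels, size=4):
    """Toy aHash: block-average a grayscale image (list of rows) down to
    size x size, then emit a 1 bit for each cell brighter than the mean.
    Assumes dimensions divide evenly by `size` to keep the sketch short."""
    h, w = len(pixels), len(pixels[0])
    assert h % size == 0 and w % size == 0
    bh, bw = h // size, w // size
    cells = []
    for i in range(size):
        for j in range(size):
            block = [pixels[r][c]
                     for r in range(i * bh, (i + 1) * bh)
                     for c in range(j * bw, (j + 1) * bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return tuple(1 if v > mean else 0 for v in cells)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def exact_fp(pixels):
    """Exact fingerprint: any change at all yields a different digest."""
    return hashlib.sha256(bytes(v for row in pixels for v in row)).hexdigest()

# A 16x16 grayscale gradient as the "copyrighted" image.
original = [[16 * r + c for c in range(16)] for r in range(16)]

# One pixel changed: exact hash breaks, perceptual hash survives.
tweaked = [row[:] for row in original]
tweaked[0][0] = 255

# Letterboxed (black bars top and bottom, i.e. wrong aspect ratio):
# even the perceptual hash drifts away from the original.
letterboxed = ([[0] * 16 for _ in range(4)] + original
               + [[0] * 16 for _ in range(4)])

print(exact_fp(original) == exact_fp(tweaked))                      # False
print(average_hash(original) == average_hash(tweaked))              # True
print(hamming(average_hash(original), average_hash(letterboxed)))   # > 0
```

The point of the sketch: exact matching has zero tolerance, and even transformation-tolerant fingerprints lose the match once the content is distorted enough, which is exactly the leeway problem described above.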