▲ myrmidon 3 hours ago

If there is a glut of legal, AI-generated CSAM material, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and reduces the "vigilance" of prosecutors, too ("it's probably just AI generated anyway..."). You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

> What is the criteria for this?

My criterion would be victims suffering personally from the generated material. The "no harm" argument only really applies if victims and their social bubble never find out about the material (but that did happen, sometimes intentionally, in many cases). You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.
▲ guerrilla 10 minutes ago

> If there is a glut of legal, AI generated CSAM material then this provides a lot of deniability for criminal creators/spreaders that cause genuine harm, and reduces "vigilance" of prosecutors, too ("it's probably just AI generated anyway...").

> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

I don't know about that. Would "I didn't know it was real" really count as a legal defense?
▲ Eisenstein 2 hours ago

> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

But that reason is highly problematic. Laws should be able to stand on their own merits, and saying "this makes enforcement of other laws harder" does not do that. You could use the same reasoning against encryption.

> You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.

I thought you were saying that the kids who were in the dataset the model was trained on would be harmed. I agree with what I assume you meant based on your reply, which is that people who had their likeness altered are harmed.