danaris | 3 hours ago
If it's AI-generated, it is fundamentally not CSAM. The reason we shifted to the terminology "CSAM", away from "child pornography", is specifically to indicate that it is Child Sexual Abuse Material: that is, an actual child was sexually abused to make it. You can call it child porn if you really want, but do not call something that never involved the abuse of a real, living, flesh-and-blood child "CSAM" (or "CSEM", "Exploitation" rather than "Abuse", which is used in some circles).

This includes drawings, CG animations, written descriptions, and videos where such acts are simulated with a consenting adult (or, honestly, a non-consenting one: that can be horrific, illegal, and unquestionably sexual assault without being CSAM), as well as anything AI-generated.

These kinds of distinctions in terminology are important, and yes, I will die on this hill.
yellowapple | a minute ago
I think the one case where I'd disagree is when it's a depiction of an actual person: say, someone creates pornography (be it AI-generated, drawn, CG-animated, etc.) depicting a person who actually exists in the real world, and not just some invented character. That's certainly a case where it'd cross into actual CSAM/CSEM, because even though the child is not physically abused or exploited in the way the work depicts, such a defamatory use of the child's likeness would constitute psychological abuse/exploitation.
ashleyn | 3 hours ago
This is where my technical knowledge of genAI breaks down, but wouldn't an image generator be unable to produce such imagery unless honest-to-god CSAM were used in its training data?