yellowapple 2 months ago
I think the one case where I'd disagree is when it's a depiction of an actual person - say, someone creates pornography (be it AI-generated, drawn, CG-animated, etc.) depicting a person who actually exists in the real world, rather than some invented character. That's certainly a case where it'd cross into actual CSAM/CSEM: even though the child isn't physically abused or exploited in the way the work depicts, such a defamatory use of the child's likeness would itself constitute psychological abuse/exploitation.
danaris 2 months ago | parent
That would only apply if the child is exposed to it, either directly or indirectly - which, if it's distributed publicly, is a possibility, though far from a certainty.

I would also say that there's enough difference between being sexually abused in person and having someone make a fake image of that, that it's at least questionable to apply the term.

I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger who may need help.