yellowapple | 2 months ago
> That would only apply if the child is exposed to it

Not just the child, but anyone associated with the child: classmates sharing it around school and gossiping about it, overbearing parents punishing the child for something the child didn't even do, predators identifying the child and seeking to turn the fictional images into reality... There are a lot of plausible angles by which a fictional representation of a real person can produce tangible psychological or even physical harm, just by the mere existence of that representation.

It's in a similar vein to so-called "revenge porn": nobody was harmed in its creation (assuming the persons in it consented to being in it), and yet its dissemination has clear negative impacts on those who did not consent to said dissemination.

All of that being to say:

> I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger that may need help.

Creating pornographic works depicting a child who actually exists in the real world does indeed put that actual child in actual danger. That's why it'd be appropriate to call such works "CSAM".