| ▲ | danaris 2 months ago |
| If it's AI-generated, it is fundamentally not CSAM. The reason we shifted to the terminology "CSAM", away from "child pornography", is specifically to indicate that it is Child Sexual Abuse Material: that is, an actual child was sexually abused to make it. You can call it child porn if you really want, but do not call something that never involved the abuse of a real, living, flesh-and-blood child "CSAM". (Or "CSEM"—"Exploitation" rather than "Abuse"—which is used in some circles.) This includes drawings, CG animations, written descriptions, videos where such acts are simulated with a consenting (or, tbh, non-consenting—it can be horrific, illegal, and unquestionably sexual assault without being CSAM) adult, as well as anything AI-generated. These kinds of distinctions in terminology are important, and yes, I will die on this hill. |
|
| ▲ | skissane 2 months ago | parent | next [-] |
| This is a rather US-centric perspective. Under US law, there is a legal distinction between child pornography and child obscenity (see e.g. 18 U.S.C. § 1466A, “obscene visual representations of the sexual abuse of children”). The first is clearly CSAM; whether the second is, is open to dispute. But in Canada, the UK (and many other European countries), Australia, and New Zealand, that legal distinction doesn’t exist: both categories are subsumed under child pornography (or equivalent terms; Australian law now prefers the phrase “child abuse material”), and the authorities in those jurisdictions aren’t going to say some child pornography is CSAM and the rest isn’t; they are going to say it is all CSAM. |
| |
| ▲ | danaris 2 months ago | parent [-] | | You are speaking of legality, not terminology. The terminology is universal, as it is simply for talking about What People Are Doing, not What The Law Says. Many people will—and do, and this is why I'm taking pains to point it out—confuse and conflate CSAM and child pornography, and also the terminology and the law. That doesn't change anything about what I've said. Fundamentally, there are two basic reasons we outlaw or otherwise vilify these things: 1) Because the creation of CSAM involves the actual sexual abuse of actual children, which causes actual harm. 2) Because we think that child pornography is icky. Only the former has a basis in fundamental and universal principles. The latter is, effectively, attempting to police a thoughtcrime. Lots of places do attempt to police thoughtcrime, in various different ways (though they rarely think of it as such); that does not change the fact that this is what they are doing. | | |
| ▲ | skissane 2 months ago | parent [-] | | > The terminology is universal Is it? The US Department of Homeland Security defines "CSAM" as including generative AI images: https://www.dhs.gov/sites/default/files/2024-04/24_0408_k2p_... So does the FBI: https://www.ic3.gov/PSA/2024/PSA240329 You want to define "CSAM" more narrowly, so as to exclude those images. I'm not aware of any "official" definition, but arguably something hosted on a US federal government website is "more official" than the opinion of an HN commenter. | | |
| ▲ | danaris 2 months ago | parent [-] | | Sorry, I spoke imprecisely. The terminology is used outside of legal contexts in ways that transcend borders. To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above. That they are used in legal contexts in particular jurisdictions with particular definitions that do not exactly match what I have described does not change what I have described for general usage. By the very nature of the English language, which has no formal administering body, there is no such thing as an "official definition" of a word in common usage; even dictionaries are descriptive, not normative. It is possible that my experience is not universal; however, I have had enough exposure to the term in a variety of contexts that I am comfortable stating that, at least for a large number of Anglophone people, my description of the situation would read as accurate. YMMV, void where prohibited, no warranty is expressed or implied, etc. | | |
| ▲ | skissane 2 months ago | parent [-] | | > The terminology is used outside of legal contexts in ways that transcend borders. To clarify, the FBI and DHS publications I cited are not actually using the term in a "legal context", strictly speaking. Presently, US federal criminal law does not use the term CSAM; if the FBI or DHS arrest someone for "CSAM", they might use that term in a press release describing the arrest, but the formal criminal charges will be expressed without using it. > To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above. This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it. | | |
| ▲ | danaris 2 months ago | parent [-] | | > This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it. Knowing, as I do, the purpose (or at least the stated purpose) of coining the term—which, as I have very clearly stated, was to differentiate between situations where an actual real-life child is involved, vs those where none is: Yes, I am very confident that their intent would exclude AI-generated images. I can't see how any other conclusion could be drawn from what I've already said. |
|
| ▲ | yellowapple 2 months ago | parent | prev | next [-] |
| I think the one case where I'd disagree is when it's a depiction of an actual person - say, someone creates pornography (be it AI-generated, drawn, CG-animated, etc.) depicting a person who actually exists in the real world, and not just some invented character. That's certainly a case where it'd cross into actual CSAM/CSEM, because despite the child not physically being abused/exploited in the way depicted in the work, such a defamatory use of the child's likeness would constitute psychological abuse/exploitation. |
| |
| ▲ | danaris 2 months ago | parent [-] | | That would only apply if the child is exposed to it, either directly or indirectly—which, if it's distributed publicly, is a possibility, though far from a certainty. I would also say that there's enough difference between being sexually abused, in person, and having someone make a fake image of that, that it's at least questionable to apply the term. I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger that may need help. | | |
| ▲ | yellowapple 2 months ago | parent [-] | | > That would only apply if the child is exposed to it Not just the child, but anyone associated with the child. Classmates sharing it around school and gossiping about it, overbearing parents punishing the child for something the child didn't even do, predators identifying the child and seeking to turn the fictional images into reality... there are a lot of plausible angles for a fictional representation of a real person to produce tangible psychological or even physical harm, just by the mere existence of that representation. It's in a similar vein to so-called "revenge porn". Nobody was harmed in the creation of it (assuming that the persons in it consented to being in it), and yet the dissemination of it has clear negative impacts on those who did not consent to said dissemination. That all being to say: > I would further note that part of the reason to use the term CSAM is to emphasize that there is an actual child in actual danger that may need help. Creating pornographic works depicting a child who actually exists in the real world does indeed put that actual child in actual danger. That's why it'd be appropriate to call such works "CSAM". |
|
| ▲ | ashleyn 2 months ago | parent | prev | next [-] |
| This is where my technical knowledge of genAI breaks down, but wouldn't an image generator be unable to produce such imagery unless honest-to-god CSAM were used in the training of it? |
| |
| ▲ | gs17 2 months ago | parent | next [-] | | It's like the early demo for DALL-E where you could get "an armchair in the shape of an avocado", which presumably wasn't in the training set, but enough was in it to generalize the "armchair" and "avocado" concepts and combine them. | |
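To make the concept-composition point concrete, here is a minimal sketch of that same kind of prompt using the Hugging Face diffusers library; the checkpoint name and output filename are illustrative assumptions rather than anything specified in the thread.

```python
# Minimal sketch of zero-shot concept composition with a text-to-image
# model, assuming the Hugging Face `diffusers` library and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint choice
    torch_dtype=torch.float16,
).to("cuda")

# "Armchair" and "avocado" are each well represented in the training data;
# the exact combination need not be, yet the model composes them at inference.
image = pipe("an armchair in the shape of an avocado").images[0]
image.save("avocado_armchair.png")
```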
| ▲ | 6ix8igth 2 months ago | parent | prev | next [-] | | It's possible for the model to take disparate concepts and put them together. E.g. you can train a LoRA to teach Stable Diffusion what a cowboy hat is, then ask for Dracula in a cowboy hat. That combination probably doesn't exist in its training data, but it will give it to you just fine. I'm not about to try, but I would assume the same would apply for child pornography. |
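And a sketch of the LoRA route described above, under the same assumptions; the LoRA path here is a hypothetical placeholder for an adapter trained on the new concept.

```python
# Sketch of composing a fine-tuned concept with one the base model already
# knows, assuming the `diffusers` library as in the previous sketch.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint choice
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical LoRA weights that teach the model what a cowboy hat is.
pipe.load_lora_weights("path/to/cowboy_hat_lora")

# The base model already knows "Dracula"; the LoRA supplies the hat concept.
image = pipe("Dracula wearing a cowboy hat").images[0]
image.save("dracula_in_a_cowboy_hat.png")
```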
| ▲ | danaris 2 months ago | parent | prev [-] | | Not at all. If it's trained with images of children, and images of pornography, it should be pretty easy for it to combine the two. |
|