skissane 2 months ago

This is a rather US-centric perspective. Under US law, there is a legal distinction between child pornography and child obscenity (see e.g. 18 U.S.C. § 1466A, “obscene visual representations of the sexual abuse of children”). The first is clearly CSAM; whether the second is, is open to dispute. But in Canada, the UK (and many other European countries), Australia, and New Zealand, that legal distinction doesn’t exist: both categories are subsumed under child pornography (or equivalent terms; Australian law now prefers the phrase “child abuse material”), and the authorities in those jurisdictions aren’t going to say some child pornography is CSAM and the rest isn’t. They are going to say it is all CSAM.

danaris 2 months ago

You are speaking of legality, not terminology.

The terminology is universal, as it is simply for talking about What People Are Doing, not What The Law Says.

Many people will—and do, and this is why I'm taking pains to point it out—confuse and conflate CSAM and child pornography, and also the terminology and the law. That doesn't change anything about what I've said.

Fundamentally, there are two basic reasons we outlaw or otherwise vilify these things:

1) Because the creation of CSAM involves the actual sexual abuse of actual children, which causes actual harm.

2) Because we think that child pornography is icky.

Only the former has a basis in fundamental and universal principles. The latter is, effectively, attempting to police a thoughtcrime. Lots of places do attempt to police thoughtcrime, in various ways (though they rarely think of it as such); that does not change the fact that this is what they are doing.

skissane 2 months ago

> The terminology is universal

Is it? The US Department of Homeland Security defines "CSAM" as including generative AI images: https://www.dhs.gov/sites/default/files/2024-04/24_0408_k2p_...

So does the FBI: https://www.ic3.gov/PSA/2024/PSA240329

You want to define "CSAM" more narrowly, so as to exclude those images.

I'm not aware of any "official" definition, but arguably something hosted on a US federal government website is "more official" than the opinion of an HN commenter.

danaris 2 months ago

Sorry, I spoke imprecisely.

The terminology is used outside of legal contexts in ways that transcend borders. To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above.

That particular jurisdictions use them in legal contexts, with particular definitions that do not exactly match what I have described, does not change their general usage.

By the very nature of the English language, which has no formal administering body, there is no such thing as an "official definition" of a word in common usage; even dictionaries are descriptive, not normative. It is possible that my experience is not universal; however, I have had enough exposure to the term in a variety of contexts that I am comfortable stating that, at least for a large number of Anglophone people, my description of the situation would read as accurate.

YMMV, void where prohibited, no warranty is expressed or implied, etc.

skissane 2 months ago

> The terminology is used outside of legal contexts in ways that transcend borders.

To clarify, the FBI and DHS publications I cited are not actually using the term in a "legal context", strictly speaking. Presently, US federal criminal law does not use the term CSAM; if the FBI or DHS arrests someone for "CSAM", they might use that term in a press release describing the arrest, but the formal criminal charges will be expressed without using it.

> To the best of my knowledge, the terms "CSAM" and "CSEM" were coined outside of legal contexts, for the purposes I described above.

This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it.

danaris 2 months ago

> This is where I doubt you – did the people who originally coined the term "CSAM" intend to exclude AI-generated images from the term's scope? You are assuming they did, but I'm not convinced you are right. I suspect you may be projecting your own views about how the term should be defined on to the people who originated it.

Knowing, as I do, the purpose (or at least the stated purpose) of coining the term, which, as I have very clearly stated, was to differentiate between situations where an actual real-life child is involved and those where none is: yes, I am very confident that their intent would exclude AI-generated images. I can't see how any other conclusion could be drawn from what I've already said.