danaris | 2 months ago
You are speaking of legality, not terminology. The terminology is universal, as it is simply for talking about What People Are Doing, not What The Law Says. Many people will—and do, and this is why I'm taking pains to point it out—confuse and conflate CSAM and child pornography, and also the terminology and the law. That doesn't change anything about what I've said.

Fundamentally, there are two basic reasons we outlaw or otherwise vilify these things:

1) Because the creation of CSAM involves the actual sexual abuse of actual children, which causes actual harm.

2) Because we think that child pornography is icky.

Only the former has a basis in fundamental and universal principles. The latter is, effectively, attempting to police a thoughtcrime. Lots of places do attempt to police thoughtcrime, in various different ways (though they rarely think of it as such); that does not change the fact that this is what they are doing.
skissane | 2 months ago | parent
> The terminology is universal

Is it? The US Department of Homeland Security defines "CSAM" as including generative AI images: https://www.dhs.gov/sites/default/files/2024-04/24_0408_k2p_...

So does the FBI: https://www.ic3.gov/PSA/2024/PSA240329

You want to define "CSAM" more narrowly, so as to exclude those images. I'm not aware of any "official" definition, but arguably something hosted on a US federal government website is "more official" than the opinion of an HN commenter.