| ▲ | logicchains a day ago |
| The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM. |
|
| ▲ | cwillu a day ago | parent | next [-] |
| If libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children. |
| |
| ▲ | whamlastxmas a day ago | parent [-] | | I'm strongly against CSAM, but I will say this analogy doesn't quite hold (though the values behind it do). Libel must be an assertion that is not true. Photoshopping or AI-editing an image of someone isn't an assertion of something untrue; it's more the equivalent of saying "What if this were true?", which is perfectly legal. | | |
| ▲ | cwillu a day ago | parent [-] | | “298 (1) A defamatory libel is matter published, without lawful justification or excuse, that is likely to injure the reputation of any person by exposing him to hatred, contempt or ridicule, or that is designed to insult the person of or concerning whom it is published.
(2) A defamatory libel may be expressed directly or by insinuation or irony
(a) in words legibly marked on any substance; or
(b) by any object signifying a defamatory libel otherwise than by words.”
It doesn't have to be an assertion, or even a written statement. | | |
| ▲ | 93po 21 hours ago | parent [-] | | You're quoting Canadian law. In the US it varies by state, but libel generally requires:
- A false statement of fact (not opinion, hyperbole, or pure insinuation without a provably false factual core)
- Publication to a third party
- Fault
- Harm to reputation
In the US it is also required that the statement be written (or in a fixed form). If it's not fixed, it's slander, not libel. | | |
| ▲ | direwolf20 3 hours ago | parent | next [-] | | Pictures are statements of fact: they assert that what is depicted exists. Naked pictures cause harm to reputation. | |
| ▲ | cwillu 20 hours ago | parent | prev [-] | | The relevant jurisdiction isn't the US either. |
|
| ▲ | chrisjj a day ago | parent | prev | next [-] |
| > The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. Quite. > That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM. Really? By what US definition of CSAM? https://rainn.org/get-the-facts-about-csam-child-sexual-abus... "Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. " |
| |
|
| ▲ | tokai a day ago | parent | prev | next [-] |
| That's not what we're discussing here, even less so given that a lot of the material in question consists of edits of real pictures. |
|
| ▲ | duckbilled2 a day ago | parent | prev [-] |
| [dead] |