chrisjj a day ago

> Are you implying that it's not abuse to "undress" a child using AI?

Not at all. I am just saying it is not CSAM.

> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.

It's terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.

mrtksn 18 hours ago

CSAM: Child Sexual Abuse Material.

When you undress a child with AI, whether publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore it is CSAM.

chrisjj 12 hours ago

> When you undress a child with AI,

I guess you mean pasting a naked body on a photo of a child.

> especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated.

In which country is that?

Here in the UK, I've never heard of anyone jailed for doing that, whereas many have been for making actual child sexual abuse material.
