culi 7 hours ago

It's not really different from how we treat any other platform that can host CSAM. I guess the main difference is that it's being "made" instead of simply "distributed" here

toxik 6 hours ago | parent [-]

Uh, let's distinguish between generated images, however revolting, and actual child sexual abuse.

The main problem with the image generators is that they are used to harass and smear people (and children...). Those things were always illegal to do.

krig 6 hours ago | parent | next [-]

Those images are generated from a training set, and it is already well known and reported that those training sets contain _real_ CSAM, real violence, real abuse. That "generated" face of a child is based on real images of real children.

pavlov 5 hours ago | parent [-]

Indeed, a Stanford study from a few years back showed that the image data sets used by essentially everybody contain CSAM.

Everybody else has teams building guardrails to mitigate this fundamental existential horror of these models. Musk fired all the safety people and decided to go all in on “adult” content.

KaiserPro 4 hours ago | parent | prev | next [-]

> let's distinguish between generated images, however revolting, and actual child sexual abuse.

Can't, because even before GenAI the "oh, it's generated in Photoshop" or "they just look young" excuse was used successfully to let a lot of people walk free. The law was tightened in the early 2000s for precisely this reason.

hdgvhicv 6 hours ago | parent | prev | next [-]

> Uh, let's distinguish between generated images, however revolting, and actual child sexual abuse.

If you want. In many countries the law doesn't. If you don't like the law, your billion-dollar company still has to follow it. At least in theory.

blipvert 4 hours ago | parent | prev [-]

Pro-tip: if you are actively assisting someone in doing illegal things then you are an accomplice.