toxik 8 hours ago

[flagged]

krig 7 hours ago | parent | next [-]

Those images are generated from a training set, and it is already well known and reported that those training sets contain _real_ CSAM, real violence, real abuse. That "generated" face of a child is based on real images of real children.

pavlov 7 hours ago | parent [-]

Indeed, a Stanford Internet Observatory study from a few years back showed that the image datasets used by essentially everybody contain CSAM.

Everybody else has teams building guardrails to mitigate this fundamental existential horror of these models. Musk fired all the safety people and decided to go all in on “adult” content.

hdgvhicv 7 hours ago | parent | prev | next [-]

> Uh, let's distinguish between generated images, however revolting, and actual child sexual abuse.

If you want. In many countries, the law doesn't. If you don't like the law, your billion-dollar company still has to follow it. At least in theory.

KaiserPro 5 hours ago | parent | prev | next [-]

> let's distinguish between generated images, however revolting, and actual child sexual abuse.

Can't, because even before GenAI, the "oh, it's generated in Photoshop" or "they just look young" excuse was used successfully to allow a lot of people to walk free. The law was tightened in the early 2000s for precisely this reason.

blipvert 6 hours ago | parent | prev [-]

Pro-tip: if you are actively assisting someone in doing illegal things, then you are an accomplice.