krig 6 hours ago
Those images are generated from a training set, and it is already well known and reported that those training sets contain _real_ CSAM, real violence, real abuse. That "generated" face of a child is based on real images of real children.
pavlov 5 hours ago
Indeed, a 2023 Stanford Internet Observatory study showed that LAION-5B, an image data set used by essentially everybody, contains CSAM. Everybody else has teams building guardrails to mitigate this fundamental existential horror of these models. Musk fired all the safety people and decided to go all in on “adult” content.