pavlov · 5 hours ago
Indeed, a Stanford study from a few years back showed that the image datasets used by essentially everybody contain CSAM. Everybody else has teams building guardrails to mitigate this fundamental existential horror of these models. Musk fired all the safety people and decided to go all in on “adult” content.