michaelt 3 days ago

Nobody's distributing a free dataset of child abuse, animal torture and terror beheading images, for obvious reasons.

There are some open-weight NSFW detectors [1], but even if your detector is 99.9% accurate, you still need an appeals/review mechanism. And someone's got to look at the appeals.

[1] https://github.com/yahoo/open_nsfw
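To put rough numbers on it (the volumes below are made up, only the arithmetic matters): even a 0.1% false-positive rate on a large upload stream means a steady queue of wrongly flagged images that a human has to look at.

    # Back-of-the-envelope for the review burden of a "99.9% accurate" detector.
    # All volumes here are hypothetical; the point is the arithmetic.
    uploads_per_day = 1_000_000       # assumed daily image uploads
    false_positive_rate = 0.001       # 0.1% of benign images wrongly flagged
    benign_fraction = 0.999           # assume nearly all uploads are benign

    wrongly_flagged = uploads_per_day * benign_fraction * false_positive_rate
    print(f"~{wrongly_flagged:.0f} benign images flagged per day for appeal/review")
    # -> ~999 per day, every day, each needing human eyes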

mallowdram 3 days ago | parent | next [-]

All of this is so dystopian (flowers/beheadings) it makes Philip K. Dick look like a golden-age Hollywood musical. Are the engineers really so unaware of the essential primate forces underneath this, forces that cannot be sanitized out of the events? You can read our extinction in this value dichotomy.

alasarmas 2 days ago | parent | prev [-]

I mean, yes, my assumption is that there exists an image/video normalization algorithm whose output can then be hashed. There's an existing CSAM scanning tool that I believe uses a similar approach.
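As a rough sketch of that normalize-then-hash idea (this is a toy average hash, not what any real CSAM scanner actually uses, and the file names are made up):

    # Toy "average hash": normalize the image (grayscale, fixed 8x8 grid),
    # then hash the normalized pixels. Real perceptual hashes are far more
    # robust to crops/re-encoding; this only shows the overall shape.
    from PIL import Image

    def average_hash(path, hash_size=8):
        # Normalization: grayscale + shrink to a fixed grid, discarding
        # detail so near-duplicates end up with the same pixel pattern.
        img = Image.open(path).convert("L").resize((hash_size, hash_size), Image.LANCZOS)
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        # Hashing: one bit per pixel, set if brighter than the mean.
        bits = 0
        for i, p in enumerate(pixels):
            if p > avg:
                bits |= 1 << i
        return bits

    def hamming_distance(a, b):
        # Small distance => probably the same underlying image.
        return bin(a ^ b).count("1")

    # h1 = average_hash("original.jpg")           # hypothetical files
    # h2 = average_hash("recompressed_copy.jpg")
    # print(hamming_distance(h1, h2))             # near 0 for the same image

The point of the normalization step is that re-encoding, resizing, or minor edits don't change the hash much, so matching is done on hash distance rather than exact equality.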