alasarmas 3 days ago
It has been documented that human image moderators exist and that some have been deeply traumatized by their work. I have zero doubt that the datasets of content and metadata created by human image moderators are being bought and sold, literally trafficking in human suffering. Can you point to a comprehensive effort by the tech majors to create a freely-licensed dataset of violent content and metadata to prevent duplication of human suffering?
michaelt 3 days ago | parent
Nobody's distributing a free dataset of child abuse, animal torture, and terror beheading images, for obvious reasons. There are some open-weights NSFW detectors [1], but even if your detector is 99.9% accurate, you still need an appeals/review mechanism. And someone's got to look at the appeals.
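A quick back-of-the-envelope sketch of why 99.9% accuracy still leaves a mountain of human review. The volume and violation-rate numbers are assumptions for illustration, not figures from this thread:

    # Why a 99.9%-accurate detector still needs human reviewers.
    # Assumed, illustrative numbers: a platform receiving 1 billion
    # image uploads per day, 0.1% of which genuinely violate policy.

    uploads_per_day = 1_000_000_000   # assumed daily upload volume
    violation_rate = 0.001            # assumed share of truly violating images
    accuracy = 0.999                  # the 99.9% figure from the comment above
    false_positive_rate = 1 - accuracy

    benign_uploads = uploads_per_day * (1 - violation_rate)
    false_flags_per_day = benign_uploads * false_positive_rate

    print(f"Benign uploads wrongly flagged per day: {false_flags_per_day:,.0f}")
    # ~999,000 false flags per day -- roughly a million appeals for humans
    # to look at, before even counting true positives that need verifying.

Under those assumed numbers, the appeals queue alone approaches a million images a day, which is why the accuracy of the classifier doesn't make the human-review problem go away.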