haritha-j 3 hours ago
Also, if you've gone from zero to one of the biggest corporations in the country, and have billions to throw at the 'metaverse', I find it hard to believe that removing CSAM is where you struggle.
abdullahkhalids 2 hours ago
No. It's a legitimately difficult problem, because not all naked pictures of kids are illegal. The false positive problem is bad for business, but also generally bad even if the big social media companies were benevolent. Moderators need to actually understand the context of the picture/video, which requires knowledge of the culture and language of the people sharing the pictures. It's really difficult to do that without hiring moderators from every culture in the world. But small federated servers can often align along real-world human social networks, so it's easier for the server admin to understand what should be removed.
red_admiral an hour ago
The amount of CSAM online is completely out of control. There's already nation-level and sometimes international cooperation to catch known images with perceptual hashing (think: the opposite of cryptographic hashing), as well as other automated and manual tools. My impression is it would take Manhattan-Project levels of effort and funds to come close to "solving" this problem, especially without someone landing on a watchlist for having a telehealth-first primary care provider insurance plan and asking for advice on their toddler's chickenpox.

Human review? Meta already has small armies' worth of content moderators who tend to burn out with psychological problems, with a suicide rate where you're probably better off going to fight in a real war. (This includes workers hired by Sama in Kenya, to link back to the OP.) I will reluctantly grant Meta that they're up against a really hard problem here.
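To illustrate the "opposite of cryptographic hashing" point: a perceptual hash is designed so that visually similar images produce similar (low Hamming distance) hashes, while a cryptographic hash changes completely under any edit. Below is a toy sketch of an average-hash over a fake 8x8 "image" — a hypothetical illustration, not how any real scanning system (e.g. PhotoDNA) actually works.

```python
import hashlib

# Toy 8x8 grayscale "image" as a flat list of 0-255 pixel values.
# Purely illustrative data, generated arithmetically.
original = [(i * 37) % 256 for i in range(64)]
# A slightly brightened copy, as if the image were re-encoded or filtered.
tweaked = [min(255, p + 3) for p in original]

def average_hash(pixels):
    """Perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Perceptual hashes stay close (few differing bits) under small edits...
print("hamming distance:", hamming(average_hash(original), average_hash(tweaked)))

# ...while cryptographic hashes of the same two images share nothing.
print("sha256 equal:",
      hashlib.sha256(bytes(original)).hexdigest()
      == hashlib.sha256(bytes(tweaked)).hexdigest())
```

The design trade-off is exactly what makes moderation hard at scale: a perceptual match is robust to cropping and re-encoding but can only catch *known* images, and near-matches still need human review.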
GrinningFool 2 hours ago
Isn't this more about disincentivizing the posting of it in the first place by increasing the chances of getting banned? Once you have to remove it, it's too late.