p_l 3 days ago |
We already have experience with how false positives play out in practice, even when a case goes all the way to court. Because ostensibly good people do not want to look at the supposed CSAM themselves, they take the algorithm's (or the first reporter's) word for it, and of course nobody "good" wants to let a pedophile go free. So the algorithm ends up trying to hang a parent for photographing a skin rash to send to a doctor (happened with Google Drive scanning), or a grandparent for having a photo of their toddler grandkids playing in a kiddie pool (happened in the UK: a computer technician came across the photo and reported it to the police, and if the lawyer had not insisted on actually verifying the alleged "CSAM", the prosecution would never have checked what the photo was of). |