red_admiral 3 hours ago

The amount of CSAM online is completely out of control. There's already nation-level and sometimes international cooperation to catch known images with perceptual hashing (think: the opposite of cryptographic hashing, since similar inputs are meant to produce similar hashes) as well as other automated and manual tools.
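
To illustrate the perceptual-hashing idea: here is a minimal sketch of an average hash over a toy 8x8 grayscale "image" represented as a grid of pixel values. This is an illustrative simplification, not how production systems (e.g. PhotoDNA) actually work, but it shows the key property: a lightly altered image still hashes close to the original, whereas a cryptographic hash would change completely.

```python
def average_hash(pixels):
    """Return a 64-bit hash: bit i is set if pixel i is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")

# A synthetic 8x8 gradient image, and a copy with one corrupted pixel
# (roughly what recompression or a crop might do to a known image).
image = [[r * 8 + c for c in range(8)] for r in range(8)]
tweaked = [row[:] for row in image]
tweaked[0][0] = 60

d = hamming(average_hash(image), average_hash(tweaked))
print(d)  # only a couple of bits differ, so the match is still detected
```

Matching then reduces to checking whether the Hamming distance to any hash in the known-image database falls below a threshold, which is why perceptual hashes survive the minor edits that defeat exact-match approaches.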

My impression is it would take Manhattan-Project levels of effort and funds to come close to "solving" this problem, especially without someone getting put on a watchlist for having a telehealth-first primary care insurance plan and asking for advice about their toddler's chickenpox.

Human review? Meta already has small armies' worth of content moderators, who tend to burn out with psychological problems and have a suicide rate where you're probably better off going to fight in a real war. (This includes workers hired by Sama in Kenya, to link back to the OP.)

I will reluctantly grant Meta that they're up against a really hard problem here.

freejazz 3 hours ago | parent [-]

>I will reluctantly grant Meta that they're up against a really hard problem here.

It is a problem of their own making.

bpt3 2 hours ago | parent [-]

They created the concept of CSAM?

freejazz 14 minutes ago | parent [-]

No, being so large that it's such a problem for them.

bpt3 10 minutes ago | parent [-]

Seems like your blame is quite misplaced.

freejazz a minute ago | parent [-]

It certainly is not.