txrx0000 6 hours ago

They have plausible deniability, but the fact of the matter is: this also erases evidence of past crimes from public records. If bad things already happened then we should keep the evidence that they happened.

The root problem of CSAM is child trafficking and abuse in physical space. But for whatever reason, enforcement efforts seem to be more focused on censoring and deleting the images than on curbing the actual acts of child trafficking and rape. It's almost as if viewing (or, in this case, merely archiving) CSAM is considered a worse crime than the physical act of trafficking and sexually abusing children, which is apparently okay nowadays if you're rich or powerful enough.

lostlogin 4 hours ago | parent | next [-]

> The root problem of CSAM is child trafficking and abuse in physical space. But for whatever reason enforcement efforts seem to be more focused on censoring and deleting the images rather than on curbing the actual act of child trafficking and rape.

Things get a bit uncomfortable for various high-profile figures, political leaders, and royalty if prosecutions start happening.

supriyo-biswas 5 hours ago | parent | prev | next [-]

A middle-ground solution, for the preservation aspect, is for the admins to block the page with a message like "This page is unavailable due to reports of illegal content. If you work for a law enforcement agency and are considering using this as evidence, please contact us."

The meta conspiracy theory in all of this would be that this is an actual CSAM producer trying to take down evidence that could be used against them.

vintermann 3 hours ago | parent [-]

It's very likely that it's someone trying to take down evidence, and since they have CSAM to upload, they would be in deep legal trouble themselves if they were identified.

It is, however, not at all clear that the evidence they want scrubbed from the internet is CSAM-related. Uploading CSAM is simply the go-to tool for attackers who want to make trouble for a site.

mpalmer 5 hours ago | parent | prev | next [-]

Great point. I guess one is just a better anti-privacy boogeyman.

asmor 2 hours ago | parent [-]

It's also a great (VC-funded) business opportunity to become the technology provider for such enforcement. There are a few of these non-profit fronts with "technology partners" behind them lobbying for legislation like the UK Online Safety Act or Chat Control. Thorn is the most well-known, but one particularly interesting example is SafeToNet: after failing to get a government contract for CSAM scanning (and purging their marketing for it from the web; you can still find it under the name SafeToWatch), they pivoted to selling a slightly altered version of their app preloaded on a $200 smartphone to concerned parents, at a 2.5x price premium.

https://harmblock.com/

https://www.gsmarena.com/hmd_fuse_debuts_with_harmblock_ai_t...

Lammy an hour ago | parent | prev [-]

The people who make their living Caring A Lot would be out of a job without a constant fresh supply of things to be very concerned about.