IncreasePosts 4 hours ago

How is matching images against known hashes of child porn enabling control, censorship, and data harvesting?

whatshisface 3 hours ago | parent | next [-]

It's like letting a policeman into your house to make sure you're not committing crimes. The method (installing, behind your defenses against criminal hackers, an AI module that is programmed to betray you) is too invasive.

ceejayoz 3 hours ago | parent | prev | next [-]

Because at some point someone in power adds the JD Vance meme that was going around as a hash.

iamnothere an hour ago | parent [-]

Or leaks related to national security failures/coverups or exposing corruption. Or copyright infringement.

exyi 3 hours ago | parent | prev | next [-]

The same tool is very handy if you hypothetically wanted to control the spread of anything else, anti-ICE apps for instance.

Also, hash matching is so easily bypassed that you can be sure they'll really want to add some "AI" detector as well.

gruez 3 hours ago | parent [-]

>Same tool is very handy if you hypothetically wanted to control spread of anything else, like anti ice apps for instance.

That's a weak argument, because they can already do that today with Google's Play Protect and Apple's app notarization.

eqvinox 3 hours ago | parent | prev | next [-]

> matching images against known hashes

That's not how it works, last I checked. AIUI it's much fuzzier. It has to be: being scum doesn't automatically make you an idiot, and a single-bit change would make plain old cryptographic hashes entirely useless.

Insert your favourite dystopia to see where that ends up and how companies benefit from it.
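A toy mean-threshold "average hash" makes the point concrete (this is an illustrative stand-in, not the actual PhotoDNA/NeuralHash algorithm): a one-byte change completely scrambles a cryptographic hash, while a perceptual hash shrugs it off.

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, thresholded against the mean.
    # `pixels` stands in for an already-downscaled 8x8 grayscale image
    # (64 values in 0..255); real systems are far more elaborate.
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

img = [(10 * i) % 256 for i in range(64)]  # toy "image"
img2 = list(img)
img2[0] += 1                               # tiny single-byte perturbation

# A single changed byte gives a completely different cryptographic hash...
print(hashlib.sha256(bytes(img)).hexdigest() ==
      hashlib.sha256(bytes(img2)).hexdigest())   # False

# ...but leaves the perceptual hash untouched.
print(average_hash(img) == average_hash(img2))   # True
```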

raverbashing 3 hours ago | parent | prev [-]

I'll grant that matching hashes is probably the least bad way of going about this.

Except for that pesky detail of hash collisions