PeterStuer 8 hours ago

[flagged]

mnewme 8 hours ago

That is not the same.

The correct comparison would be:

You provide a photo studio with an adjacent art gallery and allow people to shoot CSAM content there and then exhibit their work.

direwolf20 5 hours ago

And the sign out front says "X-Ray camera photographs anyone naked — no age limits!"

And the camera is pointing out the window so you can use it on strangers walking by.

There is a point in law where you make something so easy to misuse that you become liable for the misuse.

In the USA they have the "attractive nuisance" doctrine: building a kid's playground on top of a pit of snakes is so obviously a dumb idea that you become liable for the snake-bitten kids. You can't save yourself by arguing that you didn't give the kids permission to use the playground, that it's on private property, that the kids should have seen the snakes, or that it's legal to own snakes. No: you set up a situation where people were obviously going to get hurt, and you become liable for the hurt.

PeterStuer 8 hours ago

Not knowing any better, and not having seen any of the alleged images, my default guess would be they used the exact same CSAM filtering pipeline already in place on X regardless of the origin of the submitted images.

mnewme 6 hours ago

They obviously didn't implement anything effective: you can very easily find that content, as well as involuntary nudes of other people, which is itself an invasion of privacy.

brnt 8 hours ago

If the camera reliably applied racist filters, and the ballpoint pen added hurtful words to whatever you wrote, then indeed, let them increase their legal insurance.