myrmidon 3 hours ago

If there is a glut of legal, AI-generated CSAM, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and reduces the "vigilance" of prosecutors, too ("it's probably just AI generated anyway...").

You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

> What is the criteria for this?

My criterion would be whether victims suffer personally from the generated material.

The "no harm" argument only really applies if victims and their social bubble never find out about the material (but that did happen, sometimes intentionally, in many cases).

You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.

guerrilla 10 minutes ago | parent | next

> If there is a glut of legal, AI-generated CSAM, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and reduces the "vigilance" of prosecutors, too ("it's probably just AI generated anyway...").

> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

I don't know about that. Would "I didn't know it was real" really count as a legal defense?

myrmidon 3 minutes ago | parent

> I don't know about that. Would "I didn't know it was real" really count as a legal defense?

Absolutely: the prosecution would presumably need to at least show that you could have known the material was "genuine".

This could be a huge legal boon for prosecuted "direct customers" and for co-perpetrators who can only be linked via shared material.

Eisenstein 2 hours ago | parent | prev

> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

But that reason is highly problematic. A law should be able to stand on its own justification. Saying 'this makes enforcement of other laws harder' does not do that. You could use the same reasoning against encryption.

> You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.

I thought you were saying that the kids in the dataset the model was trained on would be harmed. Based on your reply, I assume you meant that people who have their likeness altered are harmed, and with that I agree.

myrmidon 2 hours ago | parent | next

> Saying 'this makes enforcement of other laws harder' does not do that. You could use the same reasoning against encryption.

Yes. I almost completely agree with your outlook, but I think that many of our laws trade such individual freedoms for better society-wide outcomes, and those are often good tradeoffs.

Just consider gun legislation, driving licenses, KYC laws in finance, etc.: should the state have any business interfering there? I'd argue that in isolation (ideally) it shouldn't; but all of those lead to huge gains for society, making it much less likely that you are murdered by an intoxicated driver (or a machine-gunner), and limiting fraud, crime and corruption.

So even if a law looks kinda bad from a purely theoretical-ethics point of view, in my view it's still important to look at its actual effects before dismissing it as unjust.

direwolf20 an hour ago | parent | prev

Laws against money laundering come to mind. It's illegal for you to send money from your legal business to my personal account, and for me to send it from my personal account to your other legal business, not because the net result is illegal, but because my being in the middle makes it harder for "law enforcement" to trace the transaction.