Eisenstein 2 hours ago

> Uncontrolled proliferation of AI-CSAM makes detection of "genuine" data much harder

I don't follow. If a prosecutor can't find evidence of a crime and a person isn't charged, is that considered harmful? By that logic the 5th Amendment would fall into the same category, and so would encryption. Making law enforcement work harder to find evidence of a crime cannot itself be criminalized unless you can come up with a reason why the actions themselves deserve to be criminalized.

> specifically in many of the grok cases it harms young victims that were used as templates for the material.

What are the criteria for this? If something is suitably transformed such that the original model for it is not discernible or identifiable, how can it harm them?

Do not take these as arguments against the position you are arguing for, but as rebuttals to arguments that are not convincing, or that, if they were accepted, would be terrible when applied generally.

myrmidon an hour ago

If there is a glut of legal, AI-generated CSAM, it provides a lot of deniability for the criminal creators/spreaders who cause genuine harm, and it reduces the "vigilance" of prosecutors, too ("it's probably just AI-generated anyway...").

You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

> What are the criteria for this?

My criterion would be whether victims suffer personally from the generated material.

The "no harm" argument only really applies if victims and their social bubble never find out about the material (but that did happen, sometimes intentionally, in many cases).

You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.

Eisenstein 43 minutes ago

> You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions.

But that reason is highly problematic. Laws should stand on their own justification, and saying "this makes enforcement of other laws harder" does not do that. You could use the same reasoning against encryption.

> You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me.

I thought you were saying that the kids in the dataset the model was trained on would be harmed. Based on your reply, I agree with what I assume you actually meant: that people who had their likeness altered are harmed.

myrmidon 3 minutes ago

> Saying 'this makes enforcement of other laws harder' does not do that. You could use the same reasoning against encryption.

Yes. I almost completely agree with your outlook, but I think that many of our laws trade such individual freedoms for better society-wide outcomes, and those are often good tradeoffs.

Just consider gun legislation, driver's licenses, KYC laws in finance, etc.: should the state have any business interfering there? In isolation I'd argue (ideally) not; but all of those lead to huge gains for society, making it much less likely that you'll be murdered by an intoxicated driver (or a machine-gunner) and limiting fraud, crime, and corruption.

So even if laws look kinda bad from a purely theoretical-ethics point of view, it's still important, in my view, to look at their actual effects before dismissing them as unjust.