| ▲ | exodust 6 hours ago |
| I remember when CSAM meant actual children, not computer graphics. Should platforms allow violent AI images? How about "R-Rated" violence like we see in popular movies? Point-blank executions, brutal and bloody conflict involving depictions of innocent deaths, torment and suffering... all good? Hollywood says all good, how about you? How far do you take your "unacceptable content" guidance? |
|
| ▲ | pjc50 4 hours ago | parent | next [-] |
| > How about "R-Rated" violence like we see in popular movies? Movie ratings are a good example of a system for restricting who sees unacceptable content, yes. |
| |
| ▲ | ascagnel_ 2 hours ago | parent [-] | | More to the point, now that most productions are using intimacy coordinators, there's a degree of certainty around the consent behind R-rated images. There's basically no consent with what Grok is doing. | | |
| ▲ | guerrilla 12 minutes ago | parent [-] | | > There's basically no consent with what Grok is doing. Wait, how do you get consent from people who don't exist? |
|
|
|
| ▲ | myrmidon 5 hours ago | parent | prev | next [-] |
| There are multiple valid reasons to fight realistic computer-generated CSAM content. Uncontrolled proliferation of AI-CSAM makes detection of "genuine" material much harder and prosecution of perpetrators more difficult, and specifically in many of the Grok cases it harms the young victims who were used as templates for the material. Content is unacceptable if its proliferation causes sufficient harm, and that is arguably the case here. |
| |
| ▲ | Eisenstein 4 hours ago | parent [-] | | > Uncontrolled proliferation of AI-CSAM makes detection of "genuine" material much harder I don't follow. If the prosecutor can't find evidence of a crime and a person is not charged, that is considered harmful? By that logic the Fifth Amendment would fall under the same category, and so would encryption. Making law enforcement work harder to find evidence of a crime cannot be criminalized unless you can come up with a reason why the actions themselves deserve to be criminalized. > specifically in many of the Grok cases it harms the young victims who were used as templates for the material What are the criteria for this? If something is suitably transformed such that the original model for it is not discernible or identifiable, how can it harm them? Don't take these as arguments against the idea you are arguing for, but as rebuttals to arguments that are not convincing, or that, if they were, would be terrible if applied generally. | | |
| ▲ | myrmidon 3 hours ago | parent [-] | | If there is a glut of legal, AI-generated CSAM material, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and reduces the "vigilance" of prosecutors, too ("it's probably just AI-generated anyway..."). You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions. > What are the criteria for this? My criterion would be victims suffering personally from the generated material. The "no harm" argument only really applies if victims and their social bubble never find out about the material (but in many cases they did find out, sometimes because the material was shown to them intentionally). You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me. | | |
| ▲ | guerrilla 10 minutes ago | parent | next [-] | | > If there is a glut of legal, AI-generated CSAM material, then this provides a lot of deniability for criminal creators/spreaders who cause genuine harm, and reduces the "vigilance" of prosecutors, too ("it's probably just AI-generated anyway..."). > You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions. I don't know about that. Would "I didn't know it was real" really count as a legal defense? | | |
| ▲ | myrmidon 3 minutes ago | parent [-] | | > I don't know about that. Would "I didn't know it was real" really count as a legal defense? Absolutely -- the prosecution would presumably need to at least show that you could have known the material was "genuine". This could be a huge legal boon for prosecuted "direct customers" and for co-perpetrators who can only be linked via shared material. |
| |
| ▲ | Eisenstein 2 hours ago | parent | prev [-] | | > You could make a multitude of arguments against that perspective, but at least there is a conclusive reason for legal restrictions. But that reason is highly problematic. Laws should be able to stand on their own reasons. Saying 'this makes enforcement of other laws harder' does not do that. You could use the same reasoning against encryption. > You could make the same argument that a hidden camera in a locker room never causes any harm as long as it stays undetected; that is not very convincing to me. I thought you were saying that the kids who were in the dataset the model was trained on would be harmed. I agree with what I assume you meant based on your reply, which is that people who had their likeness altered are harmed. | | |
| ▲ | myrmidon 2 hours ago | parent | next [-] | | > Saying 'this makes enforcement of other laws harder' does not do that. You could use the same reasoning against encryption. Yes. I almost completely agree with your outlook, but I think that many of our laws trade such individual freedoms for better society-wide outcomes, and those are often good tradeoffs. Just consider gun legislation, driving licenses, KYC laws in finance, etc.: should the state have any business interfering there? I'd argue in isolation (ideally) not; but all of those lead to huge gains for society, making it much less likely that you'll be murdered by an intoxicated driver (or a machine-gunner) and limiting fraud, crime and corruption. So even if laws look kinda bad from a purely theoretical-ethics point of view, it's still important, in my view, to look at the actual effects they have before dismissing them as unjust. | |
| ▲ | direwolf20 an hour ago | parent | prev [-] | | Laws against money laundering come to mind. It's illegal for you to send money from your legal business to my personal account and for me to send it from my personal account to your other legal business, not because the net result is illegal, but because me being in the middle makes it harder for "law enforcement" to trace the transaction. |
|
|
|
|
|
| ▲ | KaiserPro 5 hours ago | parent | prev | next [-] |
| > I remember when CSAM meant actual children, not computer graphics. The "oh, it's photoshop" defence was an early one, which required the law in the UK to change to cover "depictions" of children, so that people who talk about ephebophiles don't have an out for creating/distributing illegal content. |
| |
| ▲ | master-lincoln 2 hours ago | parent [-] | | There still needs to be sexual abuse depicted, no? Just naked kids should not be an issue, right? | | |
| ▲ | paintbox an hour ago | parent | next [-] | | If I found a folder with a hundred images of naked kids on your PC, I would report you to the authorities, regardless of what pose the kids are depicted in. So I guess the answer is no. | |
| ▲ | direwolf20 an hour ago | parent | prev [-] | | Naked kid pictures intended for sexual gratification are illegal in most countries. | |
| ▲ | master-lincoln 20 minutes ago | parent [-] | | Hard to know the intent of a picture in most cases.
E.g. when I grew up there was a magazine for teens that showed a picture of a naked adolescent of each sex in every edition (Bravo Dr Sommer).
The intent was to educate teens and to make them feel less ashamed.
I bet there were people who used these for sexual gratification. Should that have been a reason to ban them? I don't think so. |
|
|
|
|
| ▲ | SecretDreams 17 minutes ago | parent | prev | next [-] |
| Is this post low-key advocating for anime CSAM in the name of freedom? |
|
| ▲ | 4 hours ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | thrance 3 hours ago | parent | prev | next [-] |
| Ok, imagine your mom, sister, or daughter is using X. Some random guy with an anime profile picture and a neo-Nazi bio comes in, asks Grok to make a picture of them in a bikini for the whole world to see, and the bot obliges. Do you see the issue now? Because that happened to literally millions of people last month. |
| |
| ▲ | master-lincoln 2 hours ago | parent | next [-] | | A generated picture of a family member in a bikini is an issue? I don't see it... | | |
| ▲ | mnewme 2 hours ago | parent [-] | | Because you are apparently a man and have never been harassed in your life. | | |
| ▲ | master-lincoln 2 hours ago | parent [-] | | I have been harassed. Women in bikinis are normal where I live. | | |
| ▲ | mnewme 2 hours ago | parent | next [-] | | There is a difference between running around in a bikini and people creating sexy pictures of you without your consent. You do understand that? | |
| ▲ | master-lincoln 28 minutes ago | parent [-] | | I do understand that, but so far in this thread there has been no mention of anything sexy.
I am just being pedantic because I think precision is important when it comes to criminal accusations.
Sexuality and nakedness are two different things. | | |
| ▲ | mnewme 13 minutes ago | parent [-] | | Then check what the discussion is about; this is not about some funny AI pics. | | |
| ▲ | master-lincoln 5 minutes ago | parent [-] | | It started with CSAM but then derailed into people in bikinis ¯\_(ツ)_/¯ We already have laws against distributing CSAM. Let's jail the X CEO, then. |
|
|
| |
| ▲ | b40d-48b2-979e 2 hours ago | parent | prev [-] | | People like you are why women choose the bear. |
|
|
| |
| ▲ | mnewme 2 hours ago | parent | prev [-] | | Exactly! This should not be OK. |
|
|
| ▲ | cess11 5 hours ago | parent | prev | next [-] |
| [flagged] |
|
| ▲ | mnewme 5 hours ago | parent | prev [-] |
| What the hell? As a father, I think there shouldn't be any CSAM content anywhere. And consider that it has apparently already been proven that these models had CSAM content in their training data. Also, what about the nudes of actual people? That is an invasion of privacy. I am shocked that we are even discussing this. |