joe_mamba 4 hours ago
> You can’t just “undo” some girl being harassed by AI generated nude photos of her, so we…

No, but if you send the people who made and distributed the AI nudes of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people.

But if you don't directly prosecute the people doing it, and instead just ban Grok AI, then those people will just use other AI tools, outside of US jurisdiction, to do the same things, and the problem persists. The issue keeps persisting because nobody ever goes to jail. Everyone gets a slap on the wrist and deflects accountability by blaming the AI, so more people end up getting hurt, because those who do the evil are never held directly accountable.

Obviously Grok shouldn't be legally allowed to generate fake nudes of actual kids, but given that such safeguards can and will be bypassed, that doesn't absolve the humans who knowingly break the law to achieve a nefarious goal.
Cthulhu_ 8 minutes ago
> No, but if you send the people who made and distributed the AI nudes of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people.

Actually, you'll often see the opposite happen: after Columbine, the number of school shootings went up [0], because before that, people didn't consider it an option. Same with serial killers / copycats, and a bunch of other things. Likewise, if it hadn't been in the news, a lot of people wouldn't have known you can / could create nudes of real people with Grok. News reporting on these things is its own kind of unfortunate marketing, and for every X people who are outraged about this, there will be some who are instead inspired and interested.

While punishment for a crime is indeed a deterrent, it doesn't always work, especially because in this case it's relatively easy to avoid being found out (unlike with school shootings).

[0] https://www.security.org/blog/a-timeline-of-school-shootings...
pseudony 4 hours ago
That’s just not how the world works. Youths lack judgment, which is why they can’t vote, drink, drive, have sex, or give consent to adults. A 14-year-old can’t be relied upon to understand the consequences of making nudes of some girl.

Beyond that, we regulate guns, speed limits and more according to principles like “your right to swing your fist ends at my nose”. We do that not only because shoving kids into jails is something we want to avoid, but because regulating at the source of the problem is both more feasible AND heads off a lot of tragedy.

And again, you fail to acknowledge the investigative burden you put on society to discover who originated the photo after the fact, and the trauma to the victim. If none of that computes for you, then I don’t know what to say, except that I don’t value the right to generate saucy images highly enough to swamp my already overworked police with requests to investigate who generated fake underage porn.
verdverm 28 minutes ago
> The issue keeps persisting because nobody ever goes to jail.

Yes, let's just jail every kid who makes a mistake, ya know, instead of the enablers who should know better as adults... except for that one guy, let's put him in the White House.
tecoholic 3 hours ago
The way you are arguing makes it really hard to understand what you are trying to say. I am guessing you are upset that a non-human entity is being used as a bogeyman while the actual people go free? But your argument reads like that of someone who is very upset that an AI producing CSAM is being persecuted. I won’t be surprised if people think you are defending CSAM.

In good faith, a few things:

- AI-generated imagery and Photoshop are not the same. If someone could mail Adobe a photo of a kid and ask for a modified one, and Adobe sent it back, yes, Adobe’s offices would be raided. That’s the equivalent here. It’s not a tool. It’s a service. You keep writing “AI” without taking a moment to give the “intelligence” part any thought.

- Yes, powerful people are always going to get by, as you say, and the laws and judicial system are for the masses. There is definitely unfairness in that. But it doesn’t change anything here; this is a separate conversation.

- “If not Grok, then someone else will do it” is a defeatist argument that can only mean it can’t be controlled, so don’t bother. This point is where you come across as a CSAM defender. Governments will/should do whatever they can to make society safe, even if it means playing whack-a-mole. Arguing that’s “not efficient” is frankly confusing. The judicial system is about fairness, not efficiency.

Frankly, I think you understand all of this and maybe got tunnel-visioned in your anger at the unfairness of people scapegoating technology for its failings. That’s the last thing I want to point out: raiding an office is taking action against the powerful people who build systems without accountability. They are not going to sit the model down and give it a talking-to. The intention is to identify the responsible party that allowed this to happen.