| ▲ | joe_mamba 5 hours ago |
| >And in fact the girl who was being targeted was expelled not the boys who did it [1]. And the AI is at fault for this sentencing, not the school authorities/prosecutors/judges dishing justice? WTF. How is this an AI problem and not a legal system problem? |
|
| ▲ | pseudony 4 hours ago | parent | next [-] |
You can’t “undo” a school shooting, for instance, so we have gun laws. You can’t just “undo” a girl being harassed with AI-generated nude photos of her, so yes, we should have some protections or restrictions on what you can do. You may not understand that, either because you aren’t a parent or because you aren’t emotionally equipped to grasp how serious this can be, but your lack of comprehension does not render it a non-issue. Having schools play whack-a-mole after the photos are shared around is not a valid strategy, never mind that schools primarily engage in teaching, not investigation. As AI-generated content gets less and less distinguishable from reality, these incidents will have far worse consequences, and putting such power in the hands of adolescents who demonstrably lack sound judgment (hence why they lack many other rights that adults have) is not something most parents are comfortable with - and I doubt you’ll find many teachers, psychiatrists and so on who would support your approach either.
| |
▲ | joe_mamba 4 hours ago | parent [-] | | >You can’t just “undo” some girl being harassed by AI generated nude photos of her, so we… No, but if you send the people who made and distributed the AI nudes of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people. But if you don't directly prosecute the people doing it, and instead just ban Grok AI, then those people will just use other AI tools, outside of US jurisdiction, to do the same things, and the problem persists. And the issue keeps persisting because nobody ever goes to jail. Everyone gets a slap on the wrist and deflects accountability by blaming the AI, so more people end up getting hurt because those who do the evil are never held directly accountable. Obviously Grok shouldn't be legally allowed to generate fake nudes of actual kids, but given that such safeguards can and will be bypassed, that doesn't absolve the humans who knowingly break the law to achieve a nefarious goal. | | |
▲ | Cthulhu_ 6 minutes ago | parent | next [-] | | > No, but if you send those people who made and distributed the AI nude of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people. Actually, you'll often see the opposite happen - after Columbine, the number of school shootings went up [0], for example, because before that, people didn't consider it an option. Same with serial killers / copycats, and a bunch of other stuff. Likewise, if it hadn't been in the news, a lot of people wouldn't have known you can / could create nudes of real people with Grok. News reporting on these things is its own kind of unfortunate marketing, and for every X people who are outraged about this, there will be some who are instead inspired and interested. While punishment for crimes is indeed a deterrent, it doesn't always work - especially because in this case it's relatively easy to avoid being found out (unlike school shootings). [0] https://www.security.org/blog/a-timeline-of-school-shootings... | |
▲ | pseudony 4 hours ago | parent | prev | next [-] | | That’s just not how the world works. Youths lack judgment, so they can’t vote, drink, drive, have sex or consent to adults. A 14-year-old can’t be relied upon to understand the consequences of making nudes of some girl. Beyond that, we regulate guns, speed limits and more according to principles like “your right to swing your fist ends at my nose”. We do that not only because shoving kids into jails is something we want to avoid, but because regulating at the source of the problem is both more feasible AND heads off a lot of tragedy. And again, you fail to acknowledge the investigative burden you put on society to discover who originated the photo after the fact, and the trauma to the victim. If none of that computes for you, then I don’t know what to say, except that I don’t place the right to generate saucy images highly enough to swamp my already overworked police with requests to investigate who generated fake underage porn. | | |
▲ | joe_mamba 3 hours ago | parent [-] | | >A 14-year-old can’t be relied upon to understand the consequences of making nudes of some girl. Teenagers do stupid shit all the time, but they still get prosecuted or convicted when they commit crimes. They go to juvie or their parents get punished. Being 14 is not a get-out-of-jail-free card. |
| |
| ▲ | verdverm 27 minutes ago | parent | prev | next [-] | | > And the issues keeps persisting, because nobody ever goes to jail. Yes, let's just jail every kid who makes a mistake, ya know, instead of the enablers who should know better as adults... except for that one guy, let's put him in the white house | |
▲ | tecoholic 3 hours ago | parent | prev [-] | | The way you are arguing makes it really hard to understand what you are trying to say. I am guessing you are upset that a non-human entity is being used as a bogeyman while the actual people go free? But your argument reads like that of someone who is very upset that an AI producing CSAM is being prosecuted. I won’t be surprised if people think you are defending CSAM. In good faith, a few things - AI-generated imagery and Photoshop are not the same. If someone could mail Adobe a photo of a kid and ask for a modified one, and Adobe sent it back, yes, Adobe’s offices would be raided. That’s the equivalent here. It’s not a tool. It’s a service. You keep saying AI without taking a moment to give the “intelligence” part any thought. Yes, powerful people are always going to get by, as you say, and the laws & judicial system are for the masses. There is definitely unfairness in that. But it doesn’t change anything here - it's a separate conversation. "If not Grok then someone else will do it" is a defeatist argument that can only mean it can’t be controlled, so don’t bother. This point is where you come across as a CSAM defender. Governments will/should do whatever they can to make society safe, even if it means playing whack-a-mole. Arguing that’s “not efficient” is frankly confusing. The judicial system is about fairness, not efficiency. Frankly, I think you understand all of this and maybe got tunnel-visioned in your anger at the unfairness of people scapegoating technology for human failings. That’s the last thing I want to point out: raiding an office is taking action against the powerful people who build systems without accountability. Nobody is going to sit the model down and give it a talking-to. The intention is to identify the responsible party that allowed this to happen. |
|
|
|
| ▲ | sam-cop-vimes 4 hours ago | parent | prev | next [-] |
You cannot offload all problems to the legal system. It does not have the capacity. Legal issues take time to resolve, and victims have to have the necessary resources to pursue legal action. Grok enabled abuse at a scale no legal system in the world can keep up with. It needs no explanation that generating nudes of people without their consent is a form of abuse. And if the legal system cannot keep up with protecting victims, the problem has to be dealt with at the source.
| |
▲ | joe_mamba 4 hours ago | parent [-] | | >You cannot offload all problems to the legal system. It does not have the capacity. You definitely can. You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes; you just have to send a couple, and then the problem virtually goes away. People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time. That's how you fix all issues in society and enforce law-abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else. Do people like paying their taxes? No, but they do it anyway. Why is that? Because THEY KNOW that otherwise they go to jail. Obviously the IRS and the legal system don't have the capacity to send the whole country to jail if everyone stopped paying taxes, but they send enough people to jail that the majority of the population won't risk it and follows the law. It's really that simple. | | |
| ▲ | TheOtherHobbes 3 hours ago | parent | next [-] | | None of what you've said is true. Deterrence is known to have a very limited effect on behaviour. In this case, it's far simpler to prosecute the source. | | |
▲ | soderfoo 25 minutes ago | parent | next [-] | | Increased severity of punishment has little deterrent effect, both individually and generally. The certainty or likelihood of being caught is a far more effective deterrent, but it requires effort, focus, and resources from law enforcement. It's a resource-constraint problem and a policy choice. If "they" wanted to set the tone that this type of behavior will not be tolerated, it would require a concerted multi-agency surge of investigative and prosecutorial resources. It's been done before; where there's a will there's a way. | |
▲ | joe_mamba 3 hours ago | parent | prev [-] | | >None of what you've said is true. Everything I said is true. >Deterrence is known to have a very limited effect on behaviour. It is insanely effective when actually enforced. It's not effective when the goal is to make it seem ineffective so that people can evade the system. >In this case, it's far simpler to prosecute the source. The "source" is a tool that tomorrow can be in Russia or China, where you can't prosecute. |
| |
| ▲ | panda-giddiness 2 hours ago | parent | prev | next [-] | | > People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time. That's how you fix all issues in society and enforce law abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else And yet criminals still commit crimes. Obviously jail is not the ultimate deterrent you think it is. Nobody commits crimes with the expectation that they'll get caught, and if you only "crucify some people", then most criminals are going to (rightfully) assume that they'll be one of the lucky ones. | |
▲ | everettp 3 hours ago | parent | prev | next [-] | | Actually, research shows people regularly overestimate how effective deterrence-based punishment is, particularly for children and teenagers. How many 14-year-olds do you really think are getting prosecuted and sent to jail for asking Grok to generate a nude of their classmate..? How many 14-year-olds are giving serious thought to their long-term future in the moment they are typing a prompt into Twitter..? Your argument is akin to suggesting that carmakers should sell cars to teenagers because the teenagers can be punished if they cause an accident. | |
▲ | ljm 2 hours ago | parent | prev [-] | | > You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes, you just have to send a couple, and then the problem virtually goes away. I genuinely cannot tell if you are being comically naïve or extremely obtuse here. You need only look at the world around you to see that this does not, and never will, happen. As another commenter said, this argument presents itself as apologia for CSAM, and you come across as defending the right of a business to create and publish it. I assume you don't actually believe that, but the points you made are compatible with it. A platform is as responsible for providing the services that create illegal material as it is for distributing said material. That it happens to be an AI that generates the imagery is not relevant - X and Grok are still the two services responsible for producing and hosting it. Therefore, the accountability falls on those businesses and their leadership just as much as on the individual user, because ultimately they are facilitating it. To compare to other situations: if a paedophile ring is discovered on the dark web, the FBI doesn't just arrest the individuals involved and leave the website open. It takes the entire thing down, including those operating it, even if they themselves were simply providing the server and not partaking in the content. |
|
|
|
| ▲ | anonymous908213 5 hours ago | parent | prev | next [-] |
| Have you considered that it is possible for two things to be problems? |
| |
▲ | joe_mamba 5 hours ago | parent [-] | | No, because the comment is in bad faith. It introduced an unrelated issue (poor sentencing from authorities) as an argument about the initial issue we are discussing (AI nudes), derailing the conversation, and then used the new issue it had itself introduced to legitimize its poor argument, when one has nothing to do with the other and each can be good or bad independently. I don't accept this as good-faith argumentation, nor do the HN rules. | | |
▲ | Phelinofist 4 hours ago | parent | next [-] | | You are the only one commenting in bad faith, by refusing to understand or acknowledge that the people using Grok to create such pictures AND Grok itself are both part of the issue. It should not be possible to create nudes of minors via Grok. Full stop. | |
▲ | joe_mamba 4 hours ago | parent [-] | | >You are the only one commenting in bad faith For objecting to the injection of off-topic hypothetical scenarios as an argument derailing the main topic? >It should not be possible to create nudes of minors via Grok. I agree with THIS part; I don't agree with the part where the main blame is put on the AI instead of on the people using it. That's not a bad-faith argument, it's just my PoV. If Grok disappears tomorrow, there will be other AIs, from parts of the world outside US/EU jurisdiction, that will do the same, since the cat is out of the bag and the technical barrier to entry is dropping fast. Do you keep playing whack-a-mole with the AI tools, or go after the humans actually making and distributing fake nudes of real people? |
| |
| ▲ | Bluescreenbuddy 2 hours ago | parent | prev [-] | | Then log off. |
|
|
|
| ▲ | _pdp_ 3 hours ago | parent | prev | next [-] |
You know there is no such thing as a world police. If the perpetrator is in another country / jurisdiction, it is virtually impossible to prosecute, let alone sentence. This is 100% a regulatory problem. You just cannot allow this content to be generated and distributed in public by anonymous users. It has nothing to do with free speech, but with civility and a common understanding of what is morally wrong or right. Obviously you cannot prevent this in private forums unless it is made illegal, which is a completely different problem requiring a very different solution.
|
| ▲ | lukan 5 hours ago | parent | prev [-] |
Grok made the pictures. The school authorities messed up. Both are accountable.
| |
▲ | joe_mamba 5 hours ago | parent [-] | | >Grok made the pictures. Correction: the kids made the pictures, using Grok as the tool. If kids were to "git gud" at Photoshop and use that to make nudes, would you arrest Adobe? | | |
| ▲ | defrost 5 hours ago | parent | next [-] | | In the spirit of shitty "If's ..." If kids ask a newspaper vendor for cigarettes and he provides them .. that's a no-no. If kids ask a newspaper vendor for nudes and he provides them .. that's a no-no. If kids ask Grok for CSAM and it provides them .. then ? | | |
▲ | joe_mamba 5 hours ago | parent | next [-] | | The existence and creation of cigarettes and adult nude magazines is fully legal; only their sale to kids is illegal. If kids try to illegally obtain those LEGAL items, that doesn't make the existence of the items illegal, just the act of selling them to kids. Meanwhile, the existence/creation of CSAM of actual people isn't legal for anyone, no matter their age. | |
▲ | abc123abc123 4 hours ago | parent | prev [-] | | If parents or schools let children play with explosives or do drugs and they get hurt, that's a no-no. If parents or schools let children roam the internet unsupervised... then? |
| |
| ▲ | tene80i 5 hours ago | parent | prev | next [-] | | You're suggesting an inconsistency where there isn't one. A country can ban guns and allow rope, even though both can kill. | | |
▲ | joe_mamba 4 hours ago | parent [-] | | > A country can ban guns and allow rope, even though both can kill. That's actually a good argument. And that's how the UK ended up banning not just guns but all sorts of swords, machetes and knives, while violent crime rates have not dropped. So maybe dangerous knives are not the problem, but the people using them to kill other people. Where, then, do we draw the line between lethal weapons and crime correlation? At which cutting/shooting instruments? Same with software tools, which keep getting more powerful over time, lowering the barrier to entry for generating nudes of people. Where do we draw the line on which tools are responsible for that, instead of the humans using them? |
| |
| ▲ | verdverm 31 minutes ago | parent | prev | next [-] | | it's surprising how far people will go to defend CSAM | |
▲ | lukan 5 hours ago | parent | prev [-] | | "Correction: kids made the pictures. Using Grok as the tool." No. That is not how AI works nowadays. The kids told the tool what they wanted, and the tool understood and could have refused, like all the other models - but instead it delivered. And it could only do so because it was specifically trained for that. "If kids were to "git gud" at photoshop" And what is that supposed to mean? Adobe makes general-purpose tools, as far as I know. |
|
|