pseudony | 3 hours ago:

You can't "undo" a school shooting, for instance, so we tend to have gun laws. You can't just "undo" some girl being harassed by AI-generated nude photos of her, so we... Yes, we should have some protections or restrictions on what you can do.

You may not understand it, either because you aren't a parent or because you're not emotionally equipped to grasp how serious this actually can be, but your lack of comprehension does not render it a non-issue.

Having schools play whack-a-mole after the photos are shared around is not a valid strategy. Never mind that schools primarily engage in teaching, not investigation.

As AI-generated content gets less and less distinguishable from reality, these incidents will have far worse consequences, and putting such power in the hands of adolescents who demonstrably don't have sound judgment (hence why they lack many other rights that adults have) is not something most parents are comfortable with. I doubt you'll find many teachers, psychiatrists and so on who would support your approach either.

joe_mamba | 2 hours ago:

> You can't just "undo" some girl being harassed by AI-generated nude photos of her, so we...

No, but if you send the people who made and distributed the AI nudes of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people.

But if you don't directly prosecute the people doing it, and instead just ban Grok AI, then those people will just use other AI tools, outside of US jurisdiction, to do the same things, and the problem persists. And it keeps persisting because nobody ever goes to jail: everyone gets a slap on the wrist and deflects accountability onto the AI, so more people end up getting hurt because those who do the evil are never held directly accountable.

Obviously Grok shouldn't be legally allowed to generate fake nudes of actual kids, but given that such safeguards can and will be bypassed, that doesn't absolve the humans who knowingly break the law to achieve a nefarious goal.

pseudony | 2 hours ago:

That's just not how the world works. Youths lack judgment, so they can't vote, drink, drive, or consent to sex with adults. A 14-year-old can't be relied upon to understand the consequences of making nudes of some girl.

Beyond that, we regulate guns, speed limits and more according to principles like "your right to swing your fist ends at my nose". We do that not only because shoving kids into jails is something we want to avoid, but because regulating at the source of the problem is both more feasible AND heads off a lot of tragedy.

And again, you fail to acknowledge the investigative burden you put on society to discover who originated the photo after the fact, and the trauma to the victim. If none of that computes for you, then I don't know what to say, except that I don't place the right to generate saucy images highly enough to swamp my already overworked police with requests to investigate who generated fake underage porn.

joe_mamba | an hour ago:

> A 14-year-old can't be relied upon to understand the consequences of making nudes of some girl.

Teenagers do stupid shit all the time. But they still get prosecuted or convicted when they commit crimes. They go to juvie or their parents get punished. Being 14 is not a get-out-of-jail-free card.

vanviegen | 37 minutes ago:

In that case, why not allow teenagers to carry firearms as well? Sure, some will die, others will go to jail, but at least that ought to teach the rest of them a lesson, right?

tecoholic | 2 hours ago:

The way you are arguing makes it really hard to understand what you are trying to say. I am guessing you are upset that a non-human entity is being used as a bogeyman while the actual people go free? But your argument reads like someone who is very upset that an AI producing CSAM is being persecuted. I won't be surprised if people think you are defending CSAM.

In good faith, a few things:

- AI-generated imagery and Photoshop are not the same. If someone could mail Adobe a photo of a kid and ask for a modified one, and Adobe sent it back, then yes, Adobe's offices would be raided. That's the equivalent here. It's not a tool. It's a service. You keep saying AI without taking a moment to give the "intelligence" any thought.

- Yes, powerful people are always going to get by, as you say, and the laws and judicial system are for the masses. There is definitely unfairness in it. But that doesn't change anything here - this is a separate conversation.

- "If not Grok then someone else will do it" is a defeatist argument that can only mean it can't be controlled, so don't bother. This point is where you come across as a CSAM defender.

- Governments will (and should) do whatever they can to make society safe, even if it means playing whack-a-mole. Arguing that's "not efficient" is frankly confusing. The judicial system is about fairness, not efficiency.

Frankly, I think you understand all of this and maybe got tunnel-visioned in your anger at the unfairness of people scapegoating technology for people's failings. That's the last thing I want to point out: raiding an office is taking action against the powerful people who build systems without accountability. They are not going to sit the model down and give it a talking-to. The intention is to identify the responsible party that allowed this to happen.

sam-cop-vimes | 2 hours ago:

You cannot offload all problems to the legal system. It does not have the capacity. Legal issues take time to resolve, and the victims have to have the necessary resources to pursue legal action. Grok enabled abuse at scale, which no legal system in the world can keep up with.

It needs no explanation that generating nudes of people without their consent is a form of abuse. And if the legal system cannot keep up with protecting victims, the problem has to be dealt with at the source.

joe_mamba | 2 hours ago:

> You cannot offload all problems to the legal system. It does not have the capacity.

You definitely can. You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes; you just have to send a couple, and then the problem virtually goes away. People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time.

That's how you fix all issues in society and enforce law-abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else.

Do people like filing and paying their taxes? No, but they do it anyway. Why is that? Because THEY KNOW that otherwise they go to jail. Obviously the IRS and the legal system don't have the capacity to send the whole country to jail if everyone stopped paying taxes, but they send enough people to jail for the majority of the population to not risk it and follow the law. It's really that simple.

TheOtherHobbes | 2 hours ago:

None of what you've said is true. Deterrence is known to have a very limited effect on behaviour.

In this case, it's far simpler to prosecute the source.

joe_mamba | 2 hours ago:

> None of what you've said is true.

Everything I said is true.

> Deterrence is known to have a very limited effect on behaviour.

It is insanely effective when actually enforced. It's not effective when the goal is to make it seem ineffective so that people can evade the system.

> In this case, it's far simpler to prosecute the source.

The "source" is a tool that tomorrow can be hosted in Russia or China, where you can't prosecute.

panda-giddiness | 25 minutes ago:

> People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time. That's how you fix all issues in society and enforce law-abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else.

And yet criminals still commit crimes. Obviously jail is not the ultimate deterrent you think it is. Nobody commits crimes with the expectation that they'll get caught, and if you only "crucify some people", then most criminals are going to (rightfully) assume that they'll be one of the lucky ones.

ljm | an hour ago:

> You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes; you just have to send a couple, and then the problem virtually goes away.

I genuinely cannot tell if you are being comically naïve or extremely obtuse here. You need only look at the world around you to see that this does not, and never will, happen.

As another commenter said, this argument presents itself as apologia for CSAM, and you come across as a defender of the right of a business to create and publish it. I assume you don't actually believe that, but the points you made are compatible with it.

A platform is as responsible for providing the services that create illegal material as it is for distributing said material. That it happens to be an AI that generates the imagery is not relevant - X and Grok are still the two services responsible for producing and hosting it. Therefore, the accountability falls on those businesses and their leadership just as much as on the individual user, because ultimately they are facilitating it.

To compare to other situations: if a paedophile ring is discovered on the dark web, the FBI doesn't just arrest the individuals involved and leave the website open. It takes the entire thing down, including those operating it, even if they themselves were simply providing the server and not partaking in the content.

everettp | an hour ago:

Actually, research shows people regularly overestimate how effective deterrence-based punishment is, particularly for children and teenagers.

How many 14-year-olds do you really think are getting prosecuted and sent to jail for asking Grok to generate a nude of their classmate? How many 14-year-olds are giving serious thought to their long-term future in the moment they are typing a prompt into Twitter?

Your argument is akin to suggesting that carmakers should sell cars to teenagers because the teenagers can be punished if they cause an accident.

_pdp_ | an hour ago:

You know there is no such thing as a world police or anything of that sort. If the perpetrator is in another country / jurisdiction, it is virtually impossible to prosecute, let alone sentence. It is 100% a regulatory problem in this case. You just cannot allow this content to be generated and distributed in the public domain by anonymous users. It has nothing to do with free speech, but with civility and a common understanding of what is morally wrong or right.

Obviously you cannot prevent this in private forums unless it is made illegal, which is a completely different problem that requires a very different solution.

anonymous908213 | 4 hours ago:

Have you considered that it is possible for two things to be problems?

joe_mamba | 3 hours ago:

No, because the comment is in bad faith. It introduced an unrelated issue (poor sentencing from authorities) as an argument against the initial issue we are discussing (AI nudes), derailing the conversation, and then used the new issue it introduced to legitimize a poor argument, when one has nothing to do with the other and both can be good or bad independently of each other. I don't accept this as good-faith argumentation, nor do the HN rules.

Phelinofist | 3 hours ago:

You are the only one commenting in bad faith, by refusing to understand or acknowledge that the people using Grok to create such pictures AND Grok itself are both part of the issue. It should not be possible to create nudes of minors via Grok. Full stop.

joe_mamba | 3 hours ago:

> You are the only one commenting in bad faith

For disagreeing with the injection of off-topic hypothetical scenarios as an argument derailing the main topic?

> It should not be possible to create nudes of minors via Grok.

I agree with THIS part. I don't agree with the part where the main blame is on the AI instead of on the people using it. That's not a bad-faith argument, it's just my PoV.

If Grok disappears tomorrow, there will be other AIs from other parts of the world, outside of US/EU jurisdiction, that will do the same, since the cat is out of the bag and the technical barrier to entry is dropping fast. Do you keep trying to whack-a-mole the AI tools for this, or the humans actually making and distributing fake nudes of real people?

pka | an hour ago:

> Do you keep trying to whack-a-mole the AI tools for this, or the humans actually making and distributing fake nudes of real people?

Both, obviously. For example, you go after drug distributors and drug producers. Both approaches are effective in different ways; I am not sure why you are having such trouble understanding this.

TheOtherHobbes | 2 hours ago:

This is textbook whataboutery. The law is perfectly clear on this, and Musk is liable.

Other AIs have guardrails. If Musk chooses not to implement them, that's his personal irresponsibility.

Bluescreenbuddy | an hour ago:

Then log off.

lukan | 4 hours ago:

Grok made the pictures. The school authorities messed up. Both are accountable.

joe_mamba | 3 hours ago:

> Grok made the pictures.

Correction: kids made the pictures, using Grok as the tool. If kids were to "git gud" at Photoshop and use that to make nudes, would you arrest Adobe?

defrost | 3 hours ago:

In the spirit of shitty "ifs"...

If kids ask a newspaper vendor for cigarettes and he provides them .. that's a no-no.

If kids ask a newspaper vendor for nudes and he provides them .. that's a no-no.

If kids ask Grok for CSAM and it provides it .. then?

joe_mamba | 3 hours ago:

The existence and creation of cigarettes and adult nude magazines is fully legal; only their sale to kids is illegal. If kids try to illegally obtain those LEGAL items, it doesn't make the existence of those items illegal, just the act of selling to them.

Meanwhile, the existence/creation of CSAM of actual people isn't legal for anyone, no matter the age.

notachatbot123 | 3 hours ago:

Grok created those images.

pasc1878 | 2 hours ago:

And when the magazines get sold, who is breaking the law and gets convicted? Not the children, but the shop supplying the children.

So when Grok provides the illegal pictures, then by the same logic it is Grok that is breaking the law.

abc123abc123 | 2 hours ago:

If parents or school let children play with explosives or do drugs and they get hurt, that's a no-no.

If parents or school let children roam the internet unsupervised... then?

defrost | 2 hours ago:

> If parents or school let children play with explosives or do drugs

The explosive sellers that provide explosives to someone without a certification (child or adult) get in trouble (in this part of the world) .. regardless of whether someone gets hurt (although that's an escalation).

If sellers provide ExPo to certified parents and children get access .. that's on the parents.

In that analogy of yours, if Grok provided ExPo or CSAM to children .. that's a Grok problem (ditto drugs). It's on the party providing to children, i.e. Grok.

actionfromafar | 2 hours ago:

If MechaGrok sells explosives to children, that's a go-go?

tene80i | 3 hours ago:

You're suggesting an inconsistency where there isn't one. A country can ban guns and allow rope, even though both can kill.

joe_mamba | 3 hours ago:

> A country can ban guns and allow rope, even though both can kill.

That's actually a good argument. And that's how the UK ended up banning not just guns, but all sorts of swords, machetes and knives, while violent crime rates have not dropped. So maybe dangerous knives are not the problem, but the people using them to kill other people. So then where do we draw the line between lethal weapons and crime? At which cutting/shooting instruments?

Same with software tools, which keep getting more powerful over time, lowering the barrier to entry for generating nudes of people. Where do we draw the line on which tools are responsible for that, instead of the humans using them for it?

tene80i | 2 hours ago:

You're absolutely right that where to draw the line is a difficult question. Different countries will do it differently according to their devotion to individual freedoms vs communal welfare.

The knife (as opposed to sword) example is interesting. In the U.K. you're not allowed to sell them to children. We recognise that there is individual responsibility at play, and children might not be responsible enough to buy them, given the possible harms. Does this totally solve their use in violent crime? No. But if your alternative is "it's up to the individuals to be responsible", well, that clearly doesn't work, because some people are not responsible. At a certain point, if your job is to reduce harm in the population, you look for where you can have a greater impact than just hoping every individual follows the law, because they clearly don't. And you try things even if they don't totally solve the problem. The same goes for software.

As for the violent crime rates in the U.K., I don't have those stats to hand, but murder is at a 50-year low. And since our post-Dunblane gun laws, we haven't had any school shootings. Most Britons are happy with that bargain.

jen20 | 3 hours ago:

> meanwhile the violent crime rates have not dropped.

The rate of school shootings has dropped from one (before the implementation of recommendations from the Cullen report) to zero (subsequently). Zero in 29 years - success by any measure. If you choose to look at _other_ types of violent crime, why would banning handguns have any effect?

> Where do we draw the line on which tools are responsible for that, instead of the humans using them for it?

You can ban tools which enable bad outcomes without sufficient upside, while also holding the people who use them to account.

lukan | 3 hours ago:

> Correction: kids made the pictures, using Grok as the tool.

No. That is not how AI works nowadays. Kids told the tool what they wanted, and the tool understood and could have refused, like all the other models - but instead it delivered. And it could only do so because it was specifically trained for that.

> If kids were to "git gud" at Photoshop

And what is that supposed to mean? Adobe makes general-purpose tools, as far as I know.

joe_mamba | 3 hours ago:

You're beating around the bush, not answering the main question. Anyone skilled at Photoshop can make fake nudes as good as or better than AI, including kids (we used it to make fun fakes of teachers in embarrassing situations back in the mid-00s and distributed them via MSN Messenger). So why is only the AI tool to blame for what its users do, but not Photoshop, if both tools can be used to do the same thing?

People can now 3D print guns at home, or at least parts that, when assembled, make a functioning firearm. Are 3D printer makers now to blame if someone gets killed with a 3D-printed gun?

Where do we draw the line at tools in terms of effort required, between when the tool bears the responsibility and when it's just the human using the tool to do illegal things? This is the answer I'm looking for, and I don't think there is an easy one, yet people here are too quick to pin blame based on their emotional responses, subjective biases and world views on the matter and the parties involved.

cbolton | an hour ago:

> Anyone skilled at Photoshop

So let's say there are two ways to do something illegal. The first requires skills from the perpetrator, is tricky to regulate, and is, generally speaking, not a widespread issue in practice. The second is a no-brainer even for young children to use, is easy to regulate, and is becoming a huge issue in practice. Then it makes sense to regulate only the second.

> People can now 3D print guns at home, or at least parts that, when assembled, make a functioning firearm. Are 3D printer makers now to blame if someone gets killed with a 3D-printed gun?

Tricky question, but a more accurate comparison would be with a company that runs a service to 3D print guns (= generating the image) and shoot them in the street (= publishing on X) automatically for you, and keeps accepting illegal requests while its competitors have no issue blocking them.

> Where do we draw the line at tools in terms of effort required, between when the tool bears the responsibility and when it's just the human using the tool to do illegal things?

That's also a tricky question, but generally you don't need to know precisely where to draw the line. It suffices to know that something is definitely on the wrong side of it, like X here.

szmarczak | 2 hours ago:

A 3D printer needs a blueprint. AI has all the blueprints built in. It can generalize, so the blueprints cannot simply be erased; what we can do, at least, is forbid the generation of adult content. Harm should be limited. Photoshop requires skill and manual work; that's the difference.

In the end, yes, people are the ones responsible for their actions. But we shouldn't let kids (or anyone else) harm others with little to no effort. Let's be reasonable.