| ▲ | joe_mamba 5 hours ago |
| >Grok made the pictures. Correction: kids made the pictures. Using Grok as the tool. If kids were to "git gud" at photoshop and use that to make nudes, would you arrest Adobe? |
|
| ▲ | defrost 5 hours ago | parent | next [-] |
In the spirit of shitty "ifs" ... If kids ask a newspaper vendor for cigarettes and he provides them .. that's a no-no. If kids ask a newspaper vendor for nudes and he provides them .. that's a no-no. If kids ask Grok for CSAM and it provides them .. then?
| |
| ▲ | joe_mamba 5 hours ago | parent | next [-] | | The existence and creation of cigarettes and adult nude magazines are fully legal; only their sale to kids is illegal. If kids try to illegally obtain those LEGAL items, it doesn't make the existence of those items illegal, just the act of selling to them. Meanwhile, the existence/creation of CSAM of actual people isn't legal for anyone, no matter the age. | | |
| ▲ | notachatbot123 4 hours ago | parent | next [-] | | Grok created those images. | |
| ▲ | pasc1878 4 hours ago | parent | prev [-] | | And when the magazines get sold, who is breaking the law and gets convicted? It is not the children but the shop supplying them. So when Grok provides the illegal pictures, then by the same logic it is Grok that is breaking the law. |
| |
| ▲ | abc123abc123 4 hours ago | parent | prev [-] | | If parents or schools let children play with explosives or do drugs and they get hurt, that's a no-no. If parents or schools let children roam the internet unsupervised... then? | | |
| ▲ | defrost 3 hours ago | parent | next [-] | | > If parents or schools let children play with explosives or do drugs The explosive sellers that provide explosives to someone without a certification (child or adult) get in trouble (in this part of the world) .. regardless of whether someone gets hurt (although that's an escalation). If sellers provide ExPo to certified parents and children get access .. that's on the parents. In that analogy of yours, if Grok provided ExPo or CSAM to children .. that's a Grok problem (ditto drugs). It's on the provider to children, i.e. Grok. |
| ▲ | actionfromafar 3 hours ago | parent | prev [-] | | If MechaGrok sells explosives to children, that's a go-go? |
|
|
|
| ▲ | tene80i 5 hours ago | parent | prev | next [-] |
| You're suggesting an inconsistency where there isn't one. A country can ban guns and allow rope, even though both can kill. |
| |
| ▲ | joe_mamba 4 hours ago | parent [-] | | > A country can ban guns and allow rope, even though both can kill. That's actually a good argument. And that's how the UK ended up banning not just guns, but all sorts of swords, machetes and knives, yet violent crime rates have not dropped. So maybe dangerous knives are not the problem, but the people using them to kill other people. So then where do we draw the line between lethal weapons and crime correlation? At which cutting/shooting instruments? Same with software tools, which keep getting more powerful over time, lowering the bar to entry for generating nudes of people. Where do we draw the line on which tools are responsible for that instead of the humans using them for it? | | |
| ▲ | tene80i 4 hours ago | parent | next [-] | | You’re absolutely right that it is a difficult question where to draw the line. Different countries will do it differently according to their devotion to individual freedoms vs communal welfare. The knife (as opposed to sword) example is interesting. In the U.K. you’re not allowed to sell them to children. We recognise that there is individual responsibility at play, and children might not be responsible enough to buy them, given the possible harms. Does this totally solve their use in violent crime? No. But if your alternative is “it’s up to the individuals to be responsible”, well, that clearly doesn’t work, because some people are not responsible. At a certain point, if your job is to reduce harm in the population, you look for where you can have a greater impact than just hoping every individual follows the law, because they clearly don’t. And you try things even if they don’t totally solve the problem. And indeed, the same problem in software. As for the violent crime rates in the U.K., I don’t have those stats to hand. But murder is at a 50 year low. And since our post-Dunblane gun laws, we haven’t had any school shootings. Most Britons are happy with that bargain. | |
| ▲ | jen20 4 hours ago | parent | prev [-] | | > meanwhile the violent crime rates have not dropped. The rate of school shootings has dropped from one (before the implementation of recommendations from the Cullen report) to zero (subsequently). Zero in 29 years - success by any measure. If you choose to look at _other_ types of violent crime, why would banning handguns have any effect? > Where do we draw the line on which tools are responsible for that instead of the humans using them for it? You can ban tools which enable bad outcomes without sufficient upside, while also holding the people who use them to account. |
|
|
|
| ▲ | verdverm 30 minutes ago | parent | prev | next [-] |
| it's surprising how far people will go to defend CSAM |
|
| ▲ | lukan 5 hours ago | parent | prev [-] |
| "Correction: kids made the pictures. Using Grok as the tool." No. That is not how AI nowdays works. Kids told the tool what they want and the tool understood and could have refused like all the other models - but instead it delivered. And it only could do so because it was specifically trained for that. "If kids were to "git gud" at photoshop " And what is that supposed to mean? Adobe makes general purpose tools as far as I know. |
| |
| ▲ | joe_mamba 5 hours ago | parent [-] | | You're beating around the bush and not answering the main question. Anyone skilled at Photoshop can make fake nudes as good as or even better than AI, including kids (we used it to make fun fakes of teachers in embarrassing situations back in the mid 00s and distributed them via MSN Messenger), so then why is only the AI tool to blame for what the users do, but not Photoshop, if both tools can be used to do the same thing? People can now 3D print guns at home, or at least parts that when assembled make a functioning firearm. Are 3D printer makers now to blame if someone gets killed with a 3D printed gun? Where do we draw the line for tools in terms of effort required, between when the tool bears the responsibility and when it's just the human using the tool to do illegal things? This is the answer I'm looking for, and I don't think there is an easy one, yet people here are too quick to pin blame based on their emotional responses, subjective biases and world views on the matter and the parties involved. | | |
| ▲ | cbolton 3 hours ago | parent | next [-] | | > Anyone skilled at Photoshop So let's say there are two ways to do something illegal. The first requires skill from the perpetrator, is tricky to regulate, and is generally speaking not a widespread issue in practice. The second way is a no-brainer even for young children to use, is easy to regulate, and is becoming a huge issue in practice. Then it makes sense to regulate only the second. > People can now 3D print guns at home, or at least parts that when assembled make a functioning firearm. Are 3D printer makers now to blame if someone gets killed with a 3D printed gun? Tricky question, but a more accurate comparison would be with a company that runs a service to 3D print guns (= generating the image) and shoot them in the street (= publishing on X) automatically for you, and that keeps accepting illegal requests while its competitors have no issue blocking them. > Where do we draw the line for tools in terms of effort required, between when the tool bears the responsibility and when it's just the human using the tool to do illegal things? That's also a tricky question, but generally you don't really need to know precisely where to draw the line. It suffices to know that something is definitely on the wrong side of the line, like X here. |
| ▲ | szmarczak 3 hours ago | parent | prev [-] | | A 3D printer needs a blueprint. AI has all the blueprints built-in. It can generalize, so the blueprints cannot simply be erased; however, at the very least we can forbid the generation of adult content. Harm should be limited. Photoshop requires skill and manual work; that's the difference. In the end, yes, people are the ones who are responsible for their actions. We shouldn't let kids (or anyone else) harm others with little to no effort. Let's be reasonable. |
|
|