▲ lukan | 5 hours ago
"Correction: kids made the pictures. Using Grok as the tool." No. That is not how AI nowdays works. Kids told the tool what they want and the tool understood and could have refused like all the other models - but instead it delivered. And it only could do so because it was specifically trained for that. "If kids were to "git gud" at photoshop " And what is that supposed to mean? Adobe makes general purpose tools as far as I know. | ||||||||||||||
▲ joe_mamba | 5 hours ago | parent
You're beating around the bush, not answering the main question. Anyone skilled at Photoshop can make fake nudes as good as or better than AI can, including kids (we used it to make fun fakes of teachers in embarrassing situations back in the mid '00s and distribute them via MSN Messenger). So why is the AI tool the only one to blame for what its users do, but not Photoshop, when both tools can be used to do the same thing?

People can now 3D print guns at home, or at least parts that, when assembled, make a functioning firearm. Are 3D printer makers to blame if someone gets killed with a 3D printed gun?

Where do we draw the line with tools, in terms of the effort required, between the tool bearing the responsibility and it being just a human using the tool to do illegal things? That is the answer I'm looking for, and I don't think there is an easy one. Yet people here are too quick to pin blame based on their emotional responses, subjective biases, and worldviews about the matter and the parties involved.