ljm 2 hours ago
> You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes, you just have to send a couple, and then the problem virtually goes away.

I genuinely cannot tell if you are being comically naïve or extremely obtuse here. You need only look at the world around you to see that this does not, and never will, happen.

As another commenter said, this argument presents itself as apologia for CSAM, and you come across as defending the right of a business to create and publish it. I assume you don't actually believe that, but the points you made are compatible with that position.

A platform bears responsibility both for providing the services used to create illegal material and for distributing that material. That it happens to be an AI generating the imagery is not relevant: X and Grok are still the two services responsible for producing and hosting it. The accountability therefore falls on those businesses and their leadership just as much as on the individual user, because ultimately they are facilitating it.

To compare with other situations: if a paedophile ring is discovered on the dark web, the FBI doesn't just arrest the individuals involved and leave the website open. It takes the entire thing down, including those operating it, even if they were simply providing the server and not partaking in the content.