Phelinofist 4 hours ago

You are the only one commenting in bad faith, by refusing to understand or acknowledge that both the people using Grok to create such pictures AND Grok itself are part of the issue. It should not be possible to create nudes of minors via Grok. Full stop.

joe_mamba 4 hours ago | parent [-]

>You are the only one commenting in bad faith

For disagreeing with the injection of off-topic hypothetical scenarios as an argument that derails the main topic?

>It should not be possible to create nudes of minors via Grok.

I agree with THIS part. What I don't agree with is putting the main blame on the AI instead of on the people using it. That's not a bad-faith argument, it's just my PoV.

If Grok disappears tomorrow, there will be other AIs from parts of the world outside US/EU jurisdiction that will do the same, since the cat is out of the bag and the technical barrier to entry is dropping fast.

Do you keep playing whack-a-mole with the AI tools, or go after the humans actually making and distributing fake nudes of real people?

pka 3 hours ago | parent | next [-]

> Do you keep playing whack-a-mole with the AI tools, or go after the humans actually making and distributing fake nudes of real people?

Both, obviously. For example, you go after both drug distributors and drug producers. The two approaches are effective in different ways; I am not sure why you are having such trouble understanding this.

TheOtherHobbes 3 hours ago | parent | prev [-]

This is textbook whataboutery. The law is perfectly clear on this, and Musk is liable.

Other AIs have guardrails. If Musk chooses not to implement them, that's his personal irresponsibility.