KaiserPro 7 hours ago

> That's like the 1993 moral panic that video games like Doom cause mass shootings,

Except that Doom wasn't producing illegal content.

The point is that Grok is generating illegal content for those jurisdictions. In France you can't generate CSAM; in the UK you can't distribute CSAM. Those are actual laws with legal tests, and none of them require depictions of real people — the images just need to depict _children_ to be illegal.

Moral panics require new laws to enforce, generally. This is just enforcing already existing laws.

Moreover, had it been any other site, it would have been shut down by now and the servers impounded. It's only because Musk is close to Trump and rich that he's escaped the fate you or I would have met if we'd done the same.

joe_mamba 6 hours ago | parent [-]

> Except that Doom wasn't producing illegal content.

Sure, but where's the proof that Grok is actually producing illegal content? I searched for news sources, but they're all just parroting empty accusations rather than concrete, documented cases.

pasc1878 5 hours ago | parent | next [-]

See https://www.bbc.co.uk/news/articles/cvg1mzlryxeo

Note that the IWF is not some random charity; it works with the police on these matters.

I found this as the first item in a Kagi search — perhaps you should try non-AI searches.

KaiserPro 5 hours ago | parent | prev [-]

> but they're all just parroting empty accusations rather than concrete, documented cases.

In the UK it is illegal to create, distribute, and store CSAM. A news site printing a CSAM photo would itself be legally up the shitter.

However, the IWF, who are tasked with detecting this stuff, have claimed to have found evidence of it, along with multiple other sources. Ofcom, who are nominally supposed to police this, have an open investigation, as do the Irish police.

The point is, the law has a higher threshold of proof than the news, and meeting it takes time. If there is enough evidence, then a court case (or other legal instrument) will follow.