lawn | a day ago
In Swedish: https://www.regeringen.se/contentassets/5f881006d4d346b199ca... (translated):

> Even an image in which a child, for example through special camera arrangements, is depicted in a way intended to appeal to the sexual drive, without the depicted child being said to have taken part in any sexual conduct when the image was made, can fall under the provision.

Which means the child does not have to take part in sexual acts, and undressing a child using AI could indeed be CSAM. I say "could" because all laws are open to interpretation in Sweden and it depends on the specific image. But it's safe to say that many images produced by Grok are CSAM by Swedish standards.
chrisjj | 2 hours ago
Thanks, but CSAM includes abuse, and the offence in your quote (via Google Translate) does not. Your quote's offence looks like child porn: max. 2 years in jail. CSAM goes up to life, at least here in the UK. Quite a difference.

> But it's safe to say that many images produced by Grok are CSAM by Swedish standards.

So the Govt/police would have acted against Grok, right? Have they?