techblueberry a day ago

I'm not saying I'm entirely against this, but just out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?

rsynnott a day ago | parent | next [-]

> what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?

You would be _amazed_ at the things that people commit to email and similar.

Here's a Facebook one (leaked, not extracted by authorities): https://www.reuters.com/investigates/special-report/meta-ai-...

plopilop 12 minutes ago | parent [-]

I mean, the example you link is probably just an engineer doing their job of signalling to the hierarchy that something went deeply wrong. Of course, Facebook's lack of action afterwards is proof that they did not care, but it is not quite a smoking gun.

A smoking gun would be, for instance, Facebook observing that most of their ads are scams, noting that the cost of fixing this far exceeds "the cost of any regulatory settlement involving scam ads", and the company's leadership deciding on that basis to act only in response to impending regulatory action.

https://www.reuters.com/investigations/meta-is-earning-fortu...

afavour a day ago | parent | prev | next [-]

It was known that Grok was generating these images long before any action was taken. I imagine they'll be looking for internal communications on what they were doing, or deciding not to do, during that time.

direwolf20 18 hours ago | parent | prev | next [-]

Maybe emails between the French office and the head office warning that they may be violating laws, and head office's response?

arppacket 19 hours ago | parent | prev | next [-]

There was a WaPo article yesterday that talked about how xAI deliberately loosened Grok’s safety guardrails and relaxed restrictions on sexual content in an effort to make the chatbot more engaging and “sticky” for users. xAI employees had to sign new waivers in the summer and start working with harmful content in order to train and enable those features.

I assume the raid is hoping to find communications to establish that timeline, maybe internal concerns that were ignored? Also internal metrics that might show they were aware of the problem. External analysts said Grok was generating a CSAM image every minute!!

https://www.washingtonpost.com/technology/2026/02/02/elon-mu...

chrisjj 14 hours ago | parent [-]

> External analysts said Grok was generating a CSAM image every minute!!

> https://www.washingtonpost.com/technology/2026/02/02/elon-mu...

That article has no mention of CSAM. As expected, since you can bet the Post has lawyers checking.

bluegatty 3 hours ago | parent | prev | next [-]

Email history caches. They could also have been served with requirements to produce communications, etc.

pjc50 6 hours ago | parent | prev | next [-]

Since the release of (some of) the Epstein files, that kind of "let's do some crimes" email seems much more plausible.

Mordisquitos a day ago | parent | prev | next [-]

What do they hope to find, specifically? Who knows, but maybe the prosecutors have a better awareness of specifics than us HN commenters who have not been involved in the investigation.

What may they find, hypothetically? Who knows, but maybe an internal email saying, for instance, 'Management says keep the nude photo functionality, just hide it behind a feature flag', or maybe 'Great idea to keep a backup of the images, but must cover our tracks', or perhaps 'Elon says no action on Grok nude images, we are officially unaware anything is happening.'

cwillu a day ago | parent [-]

Or “regulators don't understand the technology; short of turning it off entirely, there's nothing we can do to prevent it entirely, and the costs involved in attempting to reduce it are much greater than the likely fine, especially given that we're likely to receive such a fine anyway.”

bawolff 11 hours ago | parent | next [-]

Wouldn't surprise me, but they would have to be very incompetent to say that outside of an attorney-client privileged conversation.

OTOH, it is Musk.

pirates a day ago | parent | prev [-]

They could shut it off out of a sense of decency and respect, wtf kind of defense is this?

cwillu 21 hours ago | parent [-]

You appear to have lost the thread (or maybe you're replying to things directly from the newcomments feed? If so, please stop it). We're talking about what sort of incriminating written statements the raid might hope to discover.

moolcool a day ago | parent | prev | next [-]

Moderation rules? Training data? Abuse metrics? Identities of users who generated or accessed CSAM?

bryan_w 19 hours ago | parent [-]

Do you think that data is stored at the office? Where do you think the data is stored? The janitor's closet?

direwolf20 4 hours ago | parent [-]

My computer has a copy of all the source code I work on

reaperducer 20 hours ago | parent | prev | next [-]

> out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?

You're not too far off.

There was a good article in the Washington Post yesterday about many, many people inside the company raising alarms about the content and its legal risk, but they were blown off by managers chasing engagement metrics. They even made up a whole new metric.

There were also prompts telling the AI to act angry or sexy or other things just to keep users addicted.

chrisjj a day ago | parent | prev [-]

[flagged]

wasabi991011 18 hours ago | parent | next [-]

I don't understand your point.

In a further comment you are using a US-focused organization to define an English-language acronym. How does this relate to a French investigation?

chrisjj 15 hours ago | parent [-]

US uses English - quite a lot actually.

As for how it relates, well if the French do find that "Grok's CSAM Plan" file, they'll need to know what that acronym stands for. Right?

rsynnott a day ago | parent | prev [-]

Item one in that list is CSAM.

chrisjj a day ago | parent [-]

You are mistaken. Item #1 is "images of children of a pornographic nature".

Whereas "CSAM isn’t pornography—it’s evidence of criminal exploitation of kids." https://rainn.org/get-informed/get-the-facts-about-sexual-vi...

ffsm8 19 hours ago | parent | next [-]

You're wrong - at least from the perspective of the commons.

First paragraph on Wikipedia

> Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn,[1][2][3] is erotic material that involves or depicts persons under the designated age of majority. The precise characteristics of what constitutes child pornography vary by criminal jurisdiction.[4][5]

Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn. While I'd agree that sexualizing kids is disgusting, denying that it's porn on those grounds feels kinda... childish? Like someone holding their ears closed and shouting loudly in order not to hear the words the adults around them are saying.

bawolff 11 hours ago | parent | next [-]

I think the idea is that normal porn can be consensual. Material involving children never can be.

Perhaps similar to how we have a word for murder that is different from "killing" even though murder always involves killing.

chrisjj 17 hours ago | parent | prev [-]

> First paragraph on Wikipedia

"...the encyclopedia anyone can edit." Yes, there are people who wish to redefine CSAM to include child porn - including even that between consenting children committing no crime and no abuse.

Compare and contrast Interpol. https://www.interpol.int/en/Crimes/Crimes-against-children/A...

> The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.

I have no idea how anyone could reasonably draw that conclusion from this thread.

ffsm8 14 hours ago | parent | next [-]

> I have no idea how anyone could reasonably draw that conclusion from this thread.

> > Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.

Where exactly did you get the impression that I made this observation from this comment thread?

Your Interpol link seems, from a very casual glance, to be making literally the same argument again, btw.

> We encourage the use of appropriate terminology to avoid trivializing the sexual abuse and exploitation of children.

> Pornography is a term used for adults engaging in consensual sexual acts distributed (mostly) legally to the general public for their sexual pleasure.

chrisjj 12 hours ago | parent [-]

> Where exactly did you get the impression from I made this observation from this comment thread?

I assumed you expected us to know what you were referring to.

14 hours ago | parent | prev [-]
[deleted]
direwolf20 18 hours ago | parent | prev | next [-]

Well, RAINN are stupid then.

CSAM is the woke word for child pornography, which is the normal word for pornography involving children. Pornography is defined as material aiming to sexually stimulate, and CSAM is that.

chrisjj 17 hours ago | parent [-]

> CSAM is the woke word for child pornography

I fear you could be correct.

direwolf20 15 hours ago | parent [-]

CSAM is to child pornography as MAP is to pedophile. Both words used to refer to a thing without the negative connotation.

FireBeyond 11 hours ago | parent | next [-]

I'd say it was the other way around: MAP is an attempt at avoiding the stigma of pedophile, while CSAM is saying "pornography can be an entirely acceptable, positive, consensual thing, but that's not what 'pornography' involving children is, it's evidence of abuse or exploitation or..."

chrisjj 4 hours ago | parent [-]

Well put.

The term CSAM was adopted in the UK following outrage over the "Gary Glitter Effect": soaring offence rates driven by news of people, caught downloading images of unspeakable abuse crimes, getting mild sentences for mere child porn.

This is why many feel strongly about defending the term "CSAM" from those who seek to dilute it to cover e.g. mild Grok-style child porn.

The UK Govt. has announced plans to define CSAM in law.

chrisjj 15 hours ago | parent | prev [-]

> CSSM is to child pornography

CSSM?

chrisjj 12 hours ago | parent | next [-]

Ah. You edited it to CSAM. Thanks.

Well, I'm sure CSAM has a negative connotation. Our UK Govt. doesn't keep a database of all CSAM found by the police because it's a positive thing.

direwolf20 2 hours ago | parent [-]

Only people who are involved in CSAM arguments on the internet know what CSAM means. Ask some random person on the street if they know what CSAM means. Then ask them if they know what child porn means.

chrisjj 2 hours ago | parent [-]

> Only people who are involved in CSAM arguments on the internet know what CSAM means.

I'm pretty sure you can add all the Governments, police depts and online safety organisations who use this term and rely upon it. Do include the 196 countries which depend on the Interpol CSAM database.

anigbrowl 14 hours ago | parent | prev [-]

Dude just stop, you are being ridiculous now.

anigbrowl 14 hours ago | parent | prev [-]

A distinction without a difference.

Even if some kid makes a video of themselves jerking off for their own personal enjoyment, unprompted by anyone else, if someone else gains access to that (e.g. a technician at a store or an unprincipled guardian) and makes a copy for themselves, they're criminally exploiting the kid by doing so.

guerrilla 11 hours ago | parent | next [-]

Seems like a pretty big difference. It's got to be worse to actually do something to someone in real life than not do that.

anigbrowl 8 hours ago | parent | next [-]

Just because there are different degrees of severity and different ways to offend doesn't make it not contraband.

guerrilla an hour ago | parent [-]

I didn't argue they weren't. The person above me argued that the difference didn't matter. It does.

lysp 10 hours ago | parent | prev [-]

Not really, otherwise perpetrators will just say "I was just looking at it, I didn't do anything as bad as creating it". Their act is still illegal.

There was a cartoon picture I remember seeing around 15+ years ago of Bart Simpson performing a sex act. In some jurisdictions (such as Australia), this falls under the legal definition.

guerrilla 10 hours ago | parent [-]

> Not really, otherwise perpetrators

You don't think it's worse to molest a child than to not molest a child?

chrisjj 12 hours ago | parent | prev [-]

> A distinction without a difference.

Huge difference here in Europe. CSAM is a much more serious crime. That's why e.g. Interpol runs a global database of CSAM but doesn't bother for mere child porn.