utopiah 11 hours ago

To people claiming a physical raid is pointless from the standpoint of gathering data:

- you are thinking about a company doing good things the right way. You are thinking about a company abiding by the law, storing data on its own server, having good practices, etc.

The moment a company starts to do dubious stuff, good practices start to go out the window. People write emails with cryptic analogies, people start deleting emails, ... then, as the circumventions become more numerous and complex, there still needs to be a trail for the scheme to remain understandable. That trail will exist in written form somehow, and it must be hidden. It might be paper, it might be shadow IT, but the point is that if you are doing more than just forgetting to track coffee pods at the social corner, you will leave traces.

So yes, raids do make sense BECAUSE it's about recurring, complex activities that are just too hard for any single individual to keep in mind over long periods of time.

SilverBirch 5 hours ago | parent | next [-]

It's also just very basic police work. We're investigating this company; we think they've committed a crime. Ok, why do you think that? Well, they've very publicly and obviously committed a crime. Ok, are you going to prosecute them? Probably. Have you gone to their offices and gathered evidence? No thanks.

Of course they're going to raid their offices! They're investigating a crime! It would be quite literally insane if they tried to prosecute them for a crime and showed up to court without having even attempted basic steps to gather evidence!

NooneAtAll3 3 hours ago | parent | next [-]

that's kinda the normalization argument, not the reason behind it

"it is done because it's always done so"

monsieurbanana 2 hours ago | parent | next [-]

I'm not sure what you're getting at; physical investigation is the common procedure. You need a reason _not_ to do it, and since "it's all digital" is not a good reason, we go back to doing the usual thing.

mothballed 2 hours ago | parent [-]

It's a show of force. "Look, we have big strong men with le guns and the neat jackboots, we can send 12 of them in for every one of you." Whether it is actually needed for evidence is immaterial to that.

an hour ago | parent | next [-]
[deleted]
swiftcoder 33 minutes ago | parent | prev [-]

It can be both things at once. It obviously sends a message, but hey, maybe you get lucky, and someone left a memo in the bin by the printer that blows the case wide open.

pjc50 an hour ago | parent | prev | next [-]

Well, yes, it is actually pretty normal for suspected criminal businesses. What's unusual is that this one has their own publicity engine. Americans are just having trouble coping with the idea of a corporation being held liable for crimes.

More normally it looks like e.g. this in the UK: https://news.sky.com/video/police-raid-hundreds-of-businesse...

CyberGEND more often seems to handle small-time copyright infringement enforcement, but there are a number of authorities with the right to conduct raids.

learingsci an hour ago | parent [-]

“Americans are just having trouble coping with the idea of a corporation being held liable for crimes.”

I’m sorry but that’s absurd even amidst the cacophony of absurdity that comprises public discourse these days.

Rygian an hour ago | parent [-]

I'll bite.

How was TikTok held liable for the crimes it was accused of?

throwpoaster 2 minutes ago | parent | next [-]

It was force-sold to Oracle.

an hour ago | parent | prev [-]
[deleted]
DetroitThrow 3 hours ago | parent | prev [-]

Isn't it both necessary and normal if they need more information about why they were generating CSAM? I don't know why the rule of law shouldn't apply to child pornography or why it would be incorrect to normalize the prosecution of CSAM creators.

throwaway290 an hour ago | parent | prev [-]

EU wants to circumvent e2e to fight CSA: "nooo think about my privacy, what happened to just normal police work?"

Police raid offices while literally investigating CSA: "nooo police should not physically invade, what happened to good old electronic surveillance?"

almosthere 8 minutes ago | parent | prev | next [-]

At some point in the near future I see a day when our work laptops are nothing more than a full-screen video stream from a computer housed in a country that has no data extradition treaties and is business-friendly.

Because that country and the businesses that support that are going to get RICH from such a service.

hybrid_study 4 hours ago | parent | prev | next [-]

The people who think raids are pointless probably use TELNET instead of SSH :-)

exodust 8 hours ago | parent | prev | next [-]

[flagged]

beAbU 7 hours ago | parent | next [-]

It has always been illegal and morally reprehensible to create, own, distribute or store sexually explicit material that represents a real person without their consent, regardless of whether they are underage or not.

Grok is a platform that is enabling this en masse. If xAI can't bring in guardrails or limit who can access these capabilities, then they deserve what's coming to them.

GaryBluto 4 hours ago | parent | next [-]

>It has always been illegal and morally reprehensible to create, own, distribute or store sexually explicit material that represents a real person without their consent, regardless of whether they are underage or not.

Arguably morally reprehensible but it has not always been illegal (and still isn't in many places) if you're talking about images of adults.

Cthulhu_ 2 hours ago | parent | next [-]

"Argually" morally reprehensible? I don't think that's very cash money to be honest.

This comment chain isn't about a difference of opinion, but a difference in morality - and there's no debating morals, I think.

an hour ago | parent [-]
[deleted]
a_better_world 2 hours ago | parent | prev [-]

Did you miss the "without their consent" part?

philipwhiuk 2 hours ago | parent [-]

I think Gary is just reaching back to antiquity.

sandworm101 2 hours ago | parent | prev | next [-]

>> or store sexually explicit material that represents a real person without their consent

Who told you that? Go ask Pamela Anderson or Paris Hilton about that one. There are rules about material created without consent, but people do not retain a perpetual right to have formerly consensual material taken down. Hollywood, let alone the porn industry, would collapse overnight if every disgruntled star could have movies removed whenever they feel like it, simply by withdrawing "consent" years after creation.

And as for copyright, generally the person on camera is not the one holding the camera, and so is not the creator/owner of the material. That is a regular issue when people attempt to use the DMCA to remove images of themselves from websites.

BigTTYGothGF an hour ago | parent [-]

> Paris Hilton

She can't withdraw consent when she never gave it in the first place.

sandworm101 an hour ago | parent [-]

https://en.wikipedia.org/wiki/1_Night_in_Paris

She claimed, later, to have not consented, but given the sophistication of the production, no reasonable person could believe she had no idea it was being filmed. She might not have consented to the exact public release, but she was certainly well aware of being filmed on the day. She consented to the creation.

Same issue decades later with the iPhone hacks/leaks. They did not consent to public release, but did consent to creation and private distribution, sometimes even taking and initially sharing the photos themselves.

actionfromafar 5 hours ago | parent | prev | next [-]

I think you are going a bit too far.

Let's start from the beginning, create and own:

You're sketching out some nude fanart on a piece of paper. You created that and you own that. That has always been illegal?!

(This is apart from my feelings on Mechahitler/Grok, which aren't positive.)

reddalo 5 hours ago | parent | next [-]

You can do _almost_ anything you want in the privacy of your home; but in this case Twitter was actively and directly disseminating the pictures publicly on their platform.

kimixa 4 hours ago | parent [-]

And profiting from it, though less directly than "$ for illegal images". Even if it weren't behind a paywall (which it mostly is), driving more traffic for more ads for more income is still profiting from illegal imagery.

andrepd 5 hours ago | parent | prev [-]

> You're sketching out some nude fanart on a piece of paper.

Is Twitter a piece of paper on your desk? No, it's not.

actionfromafar 4 hours ago | parent [-]

Right.

OP had "It has always been illegal and morally reprehensible to create, own, distribute or store "

It would make more sense then to instead say:

"It has always been illegal and morally reprehensible to distribute "

andrepd 4 hours ago | parent [-]

Again, AI deepfakes are not sketches on a piece of paper. There's a massive difference between drawing your coworker naked on a piece of paper (weird, but certainly not criminal) and going "grok generate a video of my coworker bouncing on my d*ck". Not to mention the latter is generated and stored god knows where, against the consent of the person depicted.

master-lincoln 5 hours ago | parent | prev [-]

In which broken society do you live where this is true? I would say drawing sexually explicit pictures of real persons without their consent and keeping them in your drawer is neither illegal nor morally reprehensible in most of the world.

I am with you on publishing these...

Cthulhu_ 2 hours ago | parent | next [-]

I don't think one internet commenter can know or decide what is or isn't "morally reprehensible" in "most of the world". I don't speak for "most of the world" but I'm fairly sure "I drew nudes of your mom lol" will not go down well anywhere.

master-lincoln an hour ago | parent [-]

then we are both sure of different things that are hard to check ¯\_(ツ)_/¯

freetanga 3 hours ago | parent | prev [-]

Not morally reprehensible? Do you tell your coworkers “hey, last night I sketched you nude, but it’s cool, it’s in my bedside drawer…”

master-lincoln 2 hours ago | parent | next [-]

I would personally not tell them because not everyone likes to know what/that others think about them. But I do not see the moral issue if I don't tell them.

What if I only thought about it? Still morally reprehensible? Or only if I tell others I think about them? Then you could argue it's sexual harassment.

mothballed 3 hours ago | parent | prev [-]

I've had a coworker tell me something very close to that before. I could have been morally outraged but instead I just propositioned them for a date.

Personally I think it's just embarrassing, not immoral.

ed_elliott_asc 7 hours ago | parent | prev | next [-]

At my kids' school the children have been using Grok to create pics of other children without clothes on - ChatGPT etc. won't let you do that - Grok needs some controls, and X seems unable to add them itself.

3 hours ago | parent | next [-]
[deleted]
YetAnotherNick 7 hours ago | parent | prev [-]

What would raiding the office achieve in this case, apart from just showing off power?

myrmidon 7 hours ago | parent | next [-]

In such a case specifically: Uncover internal communication that shows the company was aware of the problem and ignored it, which presumably affects liability a lot.

ndr 7 hours ago | parent [-]

I wonder what they will find. They seem to have acknowledged working on the problem before.

https://x.com/elonmusk/status/2011432649353511350

verdverm 2 hours ago | parent [-]

Have you seen some of the stuff in the Enron or Epstein emails? They can be rather candid, acting as if there is nothing to hide or they will never get caught.

Elites need a reckoning

mortarion 6 hours ago | parent | prev | next [-]

This is the cyber crime unit. They will exfiltrate any data they want. They will use employee accounts to pivot into the rest of the X network. They don't just go in and grab a couple of papers, laptops and phones. They hook into the network and begin cracking.

stuaxo 7 hours ago | parent | prev | next [-]

Why are you defending X here?

It sounds like they are following due process.

CaptWillard 2 hours ago | parent [-]

> Why are you defending X here?

What a strange question.

pjc50 7 hours ago | parent | prev | next [-]

Normally getting raided by the police causes people and organizations to change their behavior.

owebmaster 6 hours ago | parent | prev [-]

Enforcing the law usually is an inhibitor for criminals

actionfromafar 5 hours ago | parent [-]

But, isn't that bad for the criminals?

ImPleadThe5th 7 hours ago | parent | prev | next [-]

How about you come back when your daughter has a fake AI nude passed around school.

ljsprague 7 hours ago | parent | next [-]

[flagged]

wtcactus 6 hours ago | parent | prev | next [-]

So, when they were doing it for the last 3 decades in Photoshop (I was in high school and this already existed), you would have been just fine with the tool being used to do it, and with the boys at the school?

Is that your argument? Did you ever expect the government to go after Adobe for "enabling" this?

sam-cop-vimes 6 hours ago | parent [-]

Not the same - the barrier to entry was too high. Most people don't have the skills to edit photos using Photoshop. Grok enabled this to happen at scale, for users who are complete non-techies. With Grok, anyone who could type a half-coherent sentence in English could generate and disseminate these images.

Edit: clarified the last sentence

wtcactus 5 hours ago | parent [-]

Sorry, but barrier to entry doesn't seem like a very good legal excuse. It goes in the same direction as NY's attempts to ban 3D printing because - supposedly - it enables people to make guns more easily.

This is a political action by the French... slowly losing their relevance, even inside the EU. Nothing else.

janalsncm 5 hours ago | parent [-]

I see what you’re getting at. You’re trying to draw a moral equivalence between photoshop and grok. Where that falls flat for me is the distribution aspect: photoshop would not also publish and broadcast the illegal material.

But police don’t care about moral equivalence. They care about the law. For the legal details we would need to consult French law. But I assume it is illegal to create and distribute the images. Heck, it’s also probably against Twitter’s TOS too so by all rights the grok account should be banned.

> This is a political action by the French

Maybe. They probably don’t like a foreign company coming in, violating their children, and getting away with it. But what Twitter did was so far out of line that I’d be shocked if French companies weren’t treated the same way.

wtcactus 5 hours ago | parent [-]

> But I assume it is illegal to create and distribute the images.

I very much expect it to be illegal to distribute the images, of course (creating them, not so much).

But the illegality, in a sane world (and until 5 minutes ago), used to attach to the person actually distributing them. If some student distributes fake sexualized images of a classmate, I very much expect the perpetrator to be punished by the law (and by the school, since we're at it).

manfre 4 hours ago | parent [-]

Creating, possessing, and distributing CSAM is illegal in the US and many other countries. Can you explain why you think it should be legal to create something that is illegal to possess or distribute?

wtcactus 2 hours ago | parent [-]

I didn't say creating isn't illegal. I said I think it probably shouldn't be illegal.

Any victimless crime is just another way for an oppressive collectivist state to further control its citizens. If you are not harming anyone (as when creating but not sharing these pictures), then it simply shouldn't be a crime. Otherwise, what are you actually punishing? Thoughtcrimes?

joe_mamba 7 hours ago | parent | prev | next [-]

In your hypothetical scenario, why aren't the school kids making and distributing fake nudes of his daughter the ones getting in trouble?

Have we outsourced all accountability for the crimes of humans to AI now?

ImPleadThe5th 7 hours ago | parent | next [-]

It's not hypothetical. And in fact the girl who was being targeted was expelled, not the boys who did it [1].

Those boys absolutely should be held accountable. But I also don't think that Grok should be able to quickly and easily generate fake revenge porn for minors.

[1] https://www.nbcnewyork.com/news/national-international/girl-...

joe_mamba 7 hours ago | parent | next [-]

>And in fact the girl who was being targeted was expelled, not the boys who did it [1].

And the AI is at fault for this sentencing, not the school authorities/prosecutors/judges dishing out justice? WTF.

How is this an AI problem and not a legal system problem?

pseudony 6 hours ago | parent | next [-]

You can’t “undo” a school shooting, for instance, so we tend to have gun laws.

You can’t just “undo” some girl being harassed by AI generated nude photos of her, so we…

Yes, we should have some protections or restrictions on what you can do.

You may not understand it, either because you aren't a parent or maybe because you're just not emotionally equipped to understand how serious this actually can be, but your lack of comprehension does not render it a non-issue.

Having schools play whack-a-mole after the photos are shared around is not a valid strategy. Never mind that schools primarily engage in teaching, not in investigation.

As AI-generated content gets less and less distinguishable from reality, these incidents will have far worse consequences and putting such power in the hands of adolescents who demonstrably don’t have sound judgment (hence why they lack many other rights that adults have) is not something most parents are comfortable with - and I doubt you’ll find many teachers, psychiatrists and so on who would support your approach either.

joe_mamba 5 hours ago | parent [-]

>You can’t just “undo” some girl being harassed by AI generated nude photos of her, so we…

No, but if you send those people who made and distributed the AI nude of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people.

But if you don't directly prosecute the people doing it, and instead just ban Grok AI, then those people will just use other AI tools, outside of US jurisdiction, to do the same things and the problem persists.

And the issue keeps persisting, because nobody ever goes to jail. Everyone gets only a slap on the wrist and deflects accountability by blaming the AI, so the issue keeps persisting and more people end up getting hurt, because those who do the evil are never held directly accountable.

Obviously Grok shouldn't be legally allowed to generate fake nudes of actual kids, but given that such safeguards can and will be bypassed, that doesn't absolve the humans who knowingly break the law to achieve a nefarious goal.

pseudony 5 hours ago | parent | next [-]

That’s just not how the world works.

Youths lack judgment, so they can't vote, drink, drive, have sex with or consent to adults.

A 14-year-old can’t be relied to understand the consequences of making nudes of some girl.

Beyond that, we regulate guns, speed limits and more according to principles like “your right to swing your fist ends at my nose”.

We do that not only because shoving kids into jails is something we want to avoid, but because regulating at the source of the problem is both more feasible AND heads off a lot of tragedy.

And again, you fail to acknowledge the investigative burden you put on society to discover who originated the photo after the fact, and the trauma to the victim.

If none of that computes for you, then I don't know what to say, except that I don't value the right to generate saucy images highly enough to swamp my already overworked police with requests to investigate who generated fake underage porn.

joe_mamba 5 hours ago | parent [-]

>A 14-year-old can’t be relied to understand the consequences of making nudes of some girl.

Teenagers do stupid shit all the time. But they still get prosecuted or convicted when they commit crimes. They go to juvie or their parents get punished. Being 14 is not a get-out-of-jail-free card.

vanviegen 4 hours ago | parent [-]

In that case, why not allow teenagers to carry firearms as well? Sure, some will die, others will go to jail, but at least that ought to teach the rest of them a lesson, right?

jermaustin1 3 hours ago | parent | next [-]

I am in agreement with you, but as kids we DID carry guns, regularly. Gun racks in our cars/trucks, and strapped to our backs as we walked down the street.

The problem stems from parents' lack of parenting, a huge lack of real after-school programs, and the TikTokification of modern society.

30 years ago, we had a lot of the same "slap on the wrist" punishments, because it was assumed that when you got home your parent was going to beat your ass. That isn't a thing anymore (rightfully so), because parenting through threat of violence just leads to those kids becoming violent parents.

Our problem is we never transitioned from violent parenting into any other kind. I watched my nieces and nephews get parented by YouTube and get social media accounts before they were 10. COVID created a society of chronically online children who don't know how to interact offline.

And yes, the tools to create bad shit are more accessible than ever, and I always come off as some angry gatekeeper, but so much of the internet as it is today has become too easy to access for people incapable of the critical thinking required for safe use.

In the last 5 years, generative AI has taken over most of the "public facing" internet, and with internet literacy at the same level it was 20-30 years ago, we are back in the "walled garden" AOL era, but it is Facebook, Instagram, Twitter, TikTok that are the gardens.

mothballed 3 hours ago | parent | prev [-]

Wut? I carried guns regularly from about age 7. Without my parents around. The USA at one point embraced radical freedom. That is the childhood I had, and I thank "god" for it on a regular basis. "Live free or die."

I'm similarly repulsed by the idea of Grok generating images of kids, but if you draw a nude of an adult woman she's not going to get raped by that existing, and you don't have a right to not be embarrassed. Tough shit, deal with it.

Cthulhu_ 2 hours ago | parent | prev | next [-]

> No, but if you send those people who made and distributed the AI nude of her to jail, these problems will virtually disappear overnight, because going to jail is a hugely effective deterrent for most people.

Actually, you'll often see the opposite happen - after Columbine, the number of school shootings went up [0], for example, because before, people didn't consider it an option. Same with serial killers/copycats, and a bunch of other stuff.

Likewise, if it hadn't been in the news, a lot of people wouldn't have known you can / could create nudes of real people with Grok. News reporting on these things is its own kind of unfortunate marketing, and for every X people that are outraged about this, there will be some that are instead inspired and interested.

While punishment for crime is indeed a deterrent, it doesn't always work. Also because in this case it's relatively easy to avoid being found out (unlike with school shootings).

[0] https://www.security.org/blog/a-timeline-of-school-shootings...

tecoholic 5 hours ago | parent | prev | next [-]

The way you are arguing makes it really hard to understand what you are trying to say. I am guessing you are upset that a non-human entity is being used as a bogeyman while the actual people go free? But your argumentation reads like someone who is very upset that an AI producing CSAM is being persecuted. I won't be surprised if people think you are defending CSAM.

In good faith, a few things - AI-generated imagery and Photoshop are not the same. If someone could mail Adobe a photo of a kid, ask for a modified one, and Adobe sent it back, then yes, Adobe's offices would be raided. That's the equivalent here. It's not a tool. It's a service. You keep saying AI without giving the "intelligence" part a moment's thought.

Yes, powerful people are always going to get by, as you say. And the laws & judicial system are for the masses. There is definitely unfairness in it. But that doesn’t change anything here - this is a separate conversation.

"If not Grok then someone else will do it" is a defeatist argument that can only mean it can't be controlled, so don't bother. This point is where you come across as a CSAM defender. Governments will/should do whatever they can to make society safe, even if it means playing whack-a-mole. Arguing that's "not efficient" is frankly confusing. The judicial system is about fairness, not efficiency.

Frankly, I think you understand all of this and maybe got tunnel-visioned in your anger at the unfairness of people scapegoating technology for human failings. That's the last thing I want to point out: raiding an office is taking action against the powerful people who build systems without accountability. They are not going to sit the model down and give it a talking-to. The intention is to identify the responsible party that allows this to happen.

BigTTYGothGF an hour ago | parent | prev | next [-]

"outside of US jurisdiction"

Did you see the part in the article about "raided in France" and "UK opens fresh investigation"?

verdverm 2 hours ago | parent | prev [-]

> And the issue keeps persisting, because nobody ever goes to jail.

Yes, let's just jail every kid who makes a mistake, ya know, instead of the enablers who should know better as adults...

except for that one guy, let's put him in the white house

sam-cop-vimes 5 hours ago | parent | prev | next [-]

You cannot offload all problems to the legal system. It does not have the capacity. Legal issues take time to resolve, and victims have to have the necessary resources to pursue legal action. Grok enabled abuse at a scale that no legal system in the world can keep up with. It needs no explanation that generating nudes of people without their consent is a form of abuse. And if the legal system cannot keep up with protecting victims, the problem has to be dealt with at the source.

joe_mamba 5 hours ago | parent [-]

>You cannot offload all problems to the legal system. It does not have the capacity.

You definitely can. You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes, you just have to send a couple, and then the problem virtually goes away.

People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time. That's how you fix all issues in society and enforce law abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else.

Do people like doing and paying their taxes? No, but they do it anyway. Why is that? Because THEY KNOW that otherwise they go to jail. Obviously the IRS and legal system don't have the capacity to send the whole country to jail if they were to stop paying taxes, but they send enough to jail in order for the majority of the population to not risk it and follow the law.

It's really that simple.

TheOtherHobbes 5 hours ago | parent | next [-]

None of what you've said is true. Deterrence is known to have a very limited effect on behaviour.

In this case, it's far simpler to prosecute the source.

soderfoo 2 hours ago | parent | next [-]

Increased severity of punishment has little deterrent effect, both individually and generally.

The certainty or likelihood of being caught is a far more effective deterrent, but it requires effort, focus, and resources from law enforcement.

It's a resource-constraint problem and a policy choice. If "they" wanted to set the tone that this type of behavior will not be tolerated, it would require a concerted multi-agency surge of investigative and prosecutorial resources. It's been done before; where there's a will, there's a way.

joe_mamba 5 hours ago | parent | prev [-]

>None of what you've said is true.

Everything I said is true.

>Deterrence is known to have a very limited effect on behaviour.

It is insanely effective when actually enforced. It's not effective when the goal is to make it seem ineffective so that people can evade the system.

>In this case, it's far simpler to prosecute the source.

The "source" is a tool that tomorrow can be in Russia or CHina and you can't prosecute.

panda-giddiness 3 hours ago | parent | prev | next [-]

> People underestimate how effective direct personal accountability is when it comes with harsh consequences like jail time. That's how you fix all issues in society and enforce law abiding behavior. You make the cost of the crime greater than the gains from it, then crucify some people in public to set an example for everyone else

And yet criminals still commit crimes. Obviously jail is not the ultimate deterrent you think it is. Nobody commits crimes with the expectation that they'll get caught, and if you only "crucify some people", then most criminals are going to (rightfully) assume that they'll be one of the lucky ones.

everettp 4 hours ago | parent | prev | next [-]

Actually, research shows people regularly overestimate how effective deterrence-based punishment is, particularly for children and teenagers. How many 14-year-olds do you really think are getting prosecuted and sent to jail for asking Grok to generate a nude of their classmate? How many 14-year-olds are giving serious thought to their long-term future in the moment they are typing a prompt into Twitter? Your argument is akin to suggesting that carmakers should sell cars to teenagers because the teenager can be punished if they cause an accident.

ljm 4 hours ago | parent | prev [-]

> You don't have to prosecute and send a million people to jail for making and distributing fake AI nudes, you just have to send a couple, and then the problem virtually goes away.

I genuinely cannot tell if you are being comically naïve or extremely obtuse here. You need only look at the world around you to see that this does not, and never will, happen.

As another commenter said, this argument is presenting itself as apologia for CSAM and you come across as a defender of the right for a business to create and publish it. I assume you don't actually believe that, but the points you made are compatible.

A platform is as responsible for providing the service that creates illegal material as it is for distributing said material. That it happens to be an AI generating the imagery is not relevant - X and Grok are still the two services responsible for producing and hosting it. Therefore, the accountability falls on those businesses and their leadership just as much as on the individual user, because ultimately they are facilitating it.

To compare to other situations: if a paedophile ring is discovered on the dark web, the FBI doesn't just arrest the individuals involved and leave the website open. It takes the entire thing down including those operating it, even if they themselves were simply providing the server and not partaking in the content.

anonymous908213 7 hours ago | parent | prev | next [-]

Have you considered that it is possible for two things to be problems?

joe_mamba 7 hours ago | parent [-]

No, because the comment is in bad faith: it introduced an unrelated issue (poor sentencing from authorities) as an argument about the initial issue we are discussing (AI nudes), derailing the conversation, and then used that newly introduced issue to legitimize a poor argument, when one has nothing to do with the other and both can be good/bad independently of each other.

I don't accept this as good-faith argumentation, and neither do the HN rules.

Phelinofist 6 hours ago | parent | next [-]

You are the only one commenting in bad faith, by refusing to understand/acknowledge that the people using Grok to create such pictures AND Grok itself are both part of the issue. It should not be possible to create nudes of minors via Grok. Full stop.

joe_mamba 6 hours ago | parent [-]

>You are the only one commenting in bad faith

For disagreeing with the injection of off-topic hypothetical scenarios as an argument, derailing the main topic?

>It should not be possible to create nudes of minors via Grok.

I agree with THIS part; I don't agree with the part where the main blame is on the AI instead of on the people using it. That's not a bad-faith argument, it's just my PoV.

If Grok disappears tomorrow, there will be other AIs from other parts of the world, outside of US/EU jurisdiction, that will do the same, since the cat is out of the bag and the technical barrier to entry is dropping fast.

Do you keep trying to whack-a-mole the AI tools for this, or the humans actually making and distributing fake nudes of real people?

pka 4 hours ago | parent | next [-]

> Do you keep trying to whack-a-mole the AI tools for this, or the humans actually making and distributing fake nudes of real people?

Both, obviously. For example, you go after drug distributors and drug producers. Both approaches are effective in different ways; I am not sure why you are having such trouble understanding this.

TheOtherHobbes 5 hours ago | parent | prev [-]

This is textbook whataboutery. The law is perfectly clear on this, and Musk is liable.

Other AIs have guardrails. If Musk chooses not to implement them, that's his personal irresponsibility.

Bluescreenbuddy 4 hours ago | parent | prev [-]

Then log off.

_pdp_ 4 hours ago | parent | prev | next [-]

You know there is no such thing as the world police or something of that sort.

If the perpetrator is in another country/jurisdiction, it is virtually impossible to prosecute them, let alone sentence them.

It is 100% a regulatory problem in this case. You just cannot allow this content to be generated and distributed in the public domain by anonymous users. It has nothing to do with free speech, but with civility and a common understanding of what is morally wrong/right.

Obviously you cannot prevent this in private forums unless it is made illegal which is a completely different problem that requires a very different solution.

lukan 7 hours ago | parent | prev [-]

Grok made the pictures.

The school authorities messed up.

Both are accountable.

joe_mamba 7 hours ago | parent [-]

>Grok made the pictures.

Correction: kids made the pictures. Using Grok as the tool.

If kids were to "git gud" at photoshop and use that to make nudes, would you arrest Adobe?

defrost 6 hours ago | parent | next [-]

In the spirit of shitty "If's ..."

If kids ask a newspaper vendor for cigarettes and he provides them .. that's a no-no.

If kids ask a newspaper vendor for nudes and he provides them .. that's a no-no.

If kids ask Grok for CSAM and it provides them .. then ?

joe_mamba 6 hours ago | parent | next [-]

The existence and creation of cigarettes and adult nude magazines are fully legal; only their sale to kids is illegal. If kids try to illegally obtain those LEGAL items, it doesn't make the existence of those items illegal, just the act of sale to them.

Meanwhile, the existence/creation of CSAM of actual people isn't legal for anyone, no matter their age.

notachatbot123 6 hours ago | parent | next [-]

Grok created those images.

pasc1878 5 hours ago | parent | prev [-]

And when the magazines get sold, who is breaking the law and gets convicted? Not the children, but the shop supplying them.

So when Grok provides the illegal pictures, then by the same logic it is Grok that is breaking the law.

abc123abc123 5 hours ago | parent | prev [-]

If parents or school let children play with explosives or do drugs and they get hurt, that's a no-no.

If parents or school let children roam the internet unsupervised... then?

defrost 5 hours ago | parent | next [-]

> If parents or school let children play with explosives or do drugs

The explosive sellers that provide explosives to someone without a certification (child or adult) get in trouble (in this part of the world), regardless of whether someone gets hurt (although that escalates things).

If sellers provide ExPo to certified parents and children get access .. that's on the parents.

In that analogy of yours, if Grok provided ExPo or CSAM to children .. that's a Grok problem,

(Ditto drugs).

It's on the provider to children. ie Grok.

actionfromafar 5 hours ago | parent | prev [-]

If MechaGrok sells explosives to children, that's a go-go?

tene80i 6 hours ago | parent | prev | next [-]

You're suggesting an inconsistency where there isn't one. A country can ban guns and allow rope, even though both can kill.

joe_mamba 6 hours ago | parent [-]

> A country can ban guns and allow rope, even though both can kill.

That's actually a good argument. And that's how the UK ended up banning not just guns, but all sorts of swords, machetes and knives; meanwhile, the violent crime rates have not dropped.

So maybe dangerous knives are not the problem, but rather the people using them to kill other people. Then where do we draw the line between lethal weapons and crime? At which cutting/shooting instruments?

Same with software tools, which keep getting more powerful with time, lowering the bar to entry for generating nudes of people. Where do we draw the line on which tools are responsible for that, instead of the humans using them?

tene80i 6 hours ago | parent | next [-]

You’re absolutely right that it is a difficult question where to draw the line. Different countries will do it differently according to their devotion to individual freedoms vs communal welfare.

The knife (as opposed to sword) example is interesting. In the U.K. you’re not allowed to sell them to children. We recognise that there is individual responsibility at play, and children might not be responsible enough to buy them, given the possible harms. Does this totally solve their use in violent crime? No. But if your alternative is “it’s up to the individuals to be responsible”, well, that clearly doesn’t work, because some people are not responsible. At a certain point, if your job is to reduce harm in the population, you look for where you can have a greater impact than just hoping every individual follows the law, because they clearly don’t. And you try things even if they don’t totally solve the problem.

And indeed, the same problem in software.

As for the violent crime rates in the U.K., I don't have those stats to hand. But murder is at a 50-year low. And since our post-Dunblane gun laws, we haven't had any school shootings. Most Britons are happy with that bargain.

jen20 6 hours ago | parent | prev [-]

> meanwhile the violent crime rates have not dropped.

The rate of school shootings has dropped from one (before the implementation of recommendations from the Cullen report) to zero (subsequently). Zero in 29 years - success by any measure.

If you choose to look at _other_ types of violent crime, why would banning handguns have any effect?

> Where do we draw the line on which tools are responsible for that instead of the humans using them for it?

You can ban tools which enable bad outcomes without sufficient upside, while also holding the people who use them to account.

verdverm 2 hours ago | parent | prev | next [-]

it's surprising how far people will go to defend CSAM

lukan 6 hours ago | parent | prev [-]

"Correction: kids made the pictures. Using Grok as the tool."

No. That is not how AI works nowadays. The kids told the tool what they wanted, and the tool understood and could have refused, like all the other models - but instead it delivered. And it could only do so because it was specifically trained for that.

"If kids were to "git gud" at photoshop "

And what is that supposed to mean?

Adobe makes general purpose tools as far as I know.

joe_mamba 6 hours ago | parent [-]

You're beating around the bush, not answering the main question.

Anyone skilled at Photoshop can make fake nudes as good as or even better than AI, including kids (we used it to make fun fakes of teachers in embarrassing situations back in the mid-'00s and distributed them via MSN Messenger). So why is only the AI tool to blame for what its users do, and not Photoshop, if both tools can be used to do the same thing?

People can now 3D print guns at home, or at least parts that, when assembled, make a functioning firearm. Are 3D printer makers now to blame if someone gets killed with a 3D-printed gun?

Where do we draw the line for tools in terms of effort required - between when the tool bears the responsibility and when it is just the human using the tool to do illegal things? This is the answer I'm looking for, and I don't think there is an easy one; yet people here are too quick to pin blame based on their emotional responses and their subjective biases and worldviews on the matter and the parties involved.

cbolton 4 hours ago | parent | next [-]

> Anyone skilled at photoshop

So let's say there are two ways to do something illegal. The first requires skill from the perpetrator, is tricky to regulate, and is, generally speaking, not a widespread issue in practice. The second is a no-brainer even for young children to use, is easy to regulate, and is becoming a huge issue in practice. Then it makes sense to regulate only the second.

> People can now 3D print guns at home, or at least parts that when assembled can make a functioning firearm. Are now 3D printer makers to blame if someone gets killed with a 3D printed gun?

Tricky question, but a more accurate comparison would be with a company that runs a service to 3D print guns (= generating the image) and shoot with them in the street (= publishing on X) automatically for you and keeps accepting illegal requests while the competitors have no issue blocking them.

> Where do we draw the line at tools in terms of effort required, between when the tool bares the responsibility and not just the human using the tool to do illegal things?

That's also a tricky question, but generally you don't really need to know precisely where to draw the line. It suffices to know that something is definitely on the wrong side of the line, like X here.

szmarczak 5 hours ago | parent | prev [-]

A 3D printer needs a blueprint. AI has all the blueprints built in. It can generalize, so the blueprints cannot simply be erased; however, at the very least we can forbid generation of adult content. Harm should be limited. Photoshop requires skill and manual work; that's the difference. In the end, yes, people are the ones responsible for their actions. We shouldn't let kids (or anyone else) harm others with little to no effort. Let's be reasonable.

whywhywhywhy 6 hours ago | parent | prev | next [-]

This happens all the time with abusive children in schools; they're rarely punished at all, even for extreme abuse and violence.

jgalt212 4 hours ago | parent | prev [-]

The police got it right.

> When the sheriff's department looked into the case, they took the opposite actions. They charged two of the boys who'd been accused of sharing explicit images — and not the girl.

anonymous908213 7 hours ago | parent | prev | next [-]

Punishing kids after the fact does not stop the damage from occurring. Nothing can stop the damage that has already occurred, but if you stop the source of the nudes, you can stop future damage from occurring to even more girls.

joe_mamba 7 hours ago | parent [-]

[flagged]

anonymous908213 7 hours ago | parent | next [-]

I'm sorry, did the article or anyone in this subthread suggest banning AI? That seems like quite a non-sequitur. I'm pretty sure the idea is to put a content filter on an online platform for one very specific kind of already-illegal content (modified nude images of real people, especially children), which is a far cry from a ban. Nothing can stop local diffusion or Photoshop, of course, but the hardware and technical barriers are so much higher that curtailing Grok would probably cut off 99% or more of the problem material. I suppose you'll tell me that if any solution is not 100% effective we should do nothing and embrace anarchy?

Edit for the addition of the line about bullying: "Bullying has always happened, therefore we should allow new forms of even worse bullying to flourish freely, even though I readily acknowledge that it can lead to victims committing suicide" is a bizarre and self-contradictory take. I don't know what point you think you're making.
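To make the "filter, not ban" distinction concrete, here is a minimal sketch of the kind of gate being described: a policy check sitting between the request and the generator, so only the narrow illegal category gets refused. All names here are hypothetical, and a real system would use trained safety classifiers rather than a toy keyword list:

    # Hypothetical request-level content filter in front of an image generator.
    # The keyword set is a toy stand-in for a trained safety classifier.
    SEXUALIZED_TERMS = {"nude", "undress", "naked"}

    def is_allowed(prompt: str, depicts_real_person: bool) -> bool:
        # Refuse sexualized edits of identifiable real people; allow the rest.
        sexualized = any(term in prompt.lower() for term in SEXUALIZED_TERMS)
        return not (sexualized and depicts_real_person)

    def handle_request(prompt: str, depicts_real_person: bool) -> str:
        if not is_allowed(prompt, depicts_real_person):
            return "refused"    # in practice: log, rate-limit, or escalate
        return "generated"      # hand the prompt off to the image model

    print(handle_request("undress this photo of my classmate", True))   # refused
    print(handle_request("a castle at sunset", False))                  # generated

The platform controls this chokepoint, which is exactly why "filter one category" does not turn into "ban AI".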

wizzwizz4 7 hours ago | parent | prev | next [-]

Child sexual abuse material is literally in the training sets. Saying "banning AI" as though it's all the same thing, and all morally-neutral, is disingenuous. (Yes, a system with both nudity and children in its dataset might still be able to produce such images – and there are important discussions to be had about that – but giving xAI the benefit of equivocation here is an act of malice.)

expedition32 5 hours ago | parent | prev | next [-]

Nobody wants to ban AI; they want to regulate it, which is what we do with all new technology.

To paraphrase: "your tech bros were so preoccupied with whether or not they could, they never considered if they should".

owebmaster 6 hours ago | parent | prev [-]

[flagged]

direwolf20 6 hours ago | parent | next [-]

Y Combinator supports doing anything that makes money

joe_mamba 6 hours ago | parent | prev [-]

I'm not defending CP, WTF is wrong with you? You're just hallucinating/making stuff up in bad faith.

BigTTYGothGF an hour ago | parent | prev | next [-]

> why aren't the school kids making and distributing fake nudes of his daughter the ones getting in trouble?

"Boys will be boys", and so on. (https://en.wikipedia.org/wiki/Rape_culture)

Cthulhu_ 2 hours ago | parent | prev | next [-]

But they are getting in trouble. However, for every one that gets in trouble, there are more who don't get discovered, or who don't get in trouble for it.

Besides, getting in trouble for something happens after the fact; the damage has been done. If it can't be done in the first place, or the barrier is too high for most, then the damage would have been prevented.

But this is a recurring dilemma.

stuaxo 7 hours ago | parent | prev | next [-]

They may well get in trouble, but that takes time; in the meantime the photos will have been seen by most kids in the school, plus you might get a year of bullying.

Education might be so disrupted that you have to change schools.

verdverm 2 hours ago | parent | prev | next [-]

Children do dumb things and make mistakes all the time; teenagers push the boundaries as far as they can (and they have a role model in the White House now).

We fault and "fine" companies for providing products that harm society all the time

Are you not going to consider the company providing a CSAM machine to be the major one at fault here?

saubeidl 6 hours ago | parent | prev [-]

This is accountability for the crimes of humans.

The crime is creating a system that lets schoolboys create fake nudes of other minors.

You don't just get to build a CSAM-generator and then be like "well I never intended for it to be used...".

The humans running a company are liable for the product that their company builds, easy as that.

joe_mamba 6 hours ago | parent [-]

>The crime is creating a system that lets schoolboys create fake nudes of other minors.

So like Photoshop? Do you want to raid Adobe's HQ?

saubeidl 6 hours ago | parent [-]

Does Photoshop have a "let me jerk off to this minor" button?

joe_mamba 4 hours ago | parent [-]

[flagged]

saubeidl 4 hours ago | parent [-]

I don't want to.

But I don't want others to be easily able to either.

In your scenario, yes, you are guilty as well. But so is the one that actually did the deed, i.e. Grok in this case.

You're arguing my point for me. Just because you do something for someone else doesn't mean you're absolved of responsibility.

You can't build a tool with a "create child porn" button and then expect not to get into trouble for helping people make child porn.

BlackFly 6 hours ago | parent | prev [-]

I really find this kind of appeal quite odious. God forbid that we expect fathers to have empathy for their sons, sisters, brothers, spouses, mothers, fathers, uncles, aunts, etc., or dare we hope that they might have empathy for friends or even strangers? It's an appeal to hypocrisy or something. Sure, I know such people exist, but it feels like throwing so many people under the bus just to (probably fail to) convince someone of something by appealing to fathers' emotional overprotectiveness of their daughters.

You should want to protect all of the people in your life from such a thing or nobody.

Bluescreenbuddy 4 hours ago | parent | prev | next [-]

If you bothered to do any research instead of downplaying it, you’d know why. It’s embarrassing you even typed that out

sapphicsnail 7 hours ago | parent | prev | next [-]

So making CSAM of real people is ok if an AI is involved?

Leynos 4 hours ago | parent | prev | next [-]

Yes. All because of sexual harassment and images depicting child sexual abuse.

athrowaway3z 7 hours ago | parent | prev | next [-]

You're defending X/Grok as if it's a public social platform.

It is a privately controlled public-facing group chat. Being a chat-medium does not grant you the same rights as being a person. France isn't America.

If a company operates to the detriment and against the values of a nation, e.g. not paying their taxes or littering in the environment, the nation will ask them to change their behavior.

If there is a conspiracy of contempt, at some point things escalate.

peder 2 hours ago | parent | prev | next [-]

This is largely a left-wing echo chamber (and it also seems much more European than the average forum), so everyone here thinks Musk is doing something illegal just because he's right-wing.

These raids are entirely political.

saubeidl 6 hours ago | parent | prev | next [-]

[dead]

joe_mamba 8 hours ago | parent | prev [-]

[flagged]

n4r9 7 hours ago | parent | next [-]

You want the French authorities to focus on the Epstein files to the exclusion of all other corporate misbehaviour?

Also, it seems pretty likely that Musk is tangled up in the Epstein shit. First Musk claimed he turned down an offer to go to the island. Now it turns out Musk repeatedly sought to visit, including wanting to know when the "wildest" party was happening, after Epstein was already known as a child sex abuser. Musk claimed that Epstein had never been given a tour of SpaceX, but it turns out he was, in 2013. It's the classic narcissistic "lie for as long as possible" behaviour. It will be interesting to see what happens as more is revealed.

joe_mamba 6 hours ago | parent [-]

>You want the French authorities to focus on the Epstein files to the exclusion of all other corporate misbehaviour?

No, I said no such thing. What I said was that the authorities' resources are a finite pie. If most of it goes towards petty stuff like corporate misbehavior that hurts nobody, there won't be enough for grave crimes like actual child abuse that actually hurts real people.

It's the same as how police won't bother with your stolen phone/bike because they have bigger crimes to catch. I'm asking for the same logic to be applied here.

n4r9 6 hours ago | parent | next [-]

There's no indication that this investigation would draw resources away from investigating the Epstein files. It's happening in France, for starters, whilst the vast majority of Epstein's crimes appear to have happened in US territories. Speaking about "the authorities" as if they're a unified global entity sounds a little conspiratorial.

watwut 5 hours ago | parent | prev [-]

> If most of it goes towards petty stuff like corporate misbehavior that hurts nobody, there won't be enough for grave crimes like actual child abuse that actually hurts real people.

1.) That is not how it works, even if we ignore the fact that France is not the USA.

2.) Lack of resources was not the issue with the Epstein prosecution. The prosecutor was literally told not to investigate by her superiors, who were trying to stop the case. She was told she was insubordinate for pursuing it. Acosta giving Epstein a sweetheart deal, or seeking to stop the prosecutor, is not a resources issue.

It is an issue of billionaires (Thiel, Musk, Gates), politicians (Clinton, Lutnick) and media darlings (Summers, Krauss and the rest of the "sexism is totally not a thing anymore" crowd literally partying with Epstein) being protected at all costs. Even now, people implicated in the Epstein files are still getting influential positions, with the explicit argument that it would be "cancel culture" not to give these people more influence.

amelius 7 hours ago | parent | prev | next [-]

I think the reasoning is that the AI contributes to more Epsteins. In some way.

terminalshort 6 hours ago | parent | next [-]

That isn't reasoning, it's wild speculation

amelius 5 hours ago | parent [-]

I seem to remember there was research behind this, but I'm not sure.

joe_mamba 7 hours ago | parent | prev [-]

How?

That's like the 1993 moral panic that video games like Doom cause mass shootings, or the 1980s moral panic that metal music causes Satanism, or the 1950s moral panic that superhero comic book violence leads to juvenile delinquency. Politicians are constantly looking for a made-up external enemy to divert attention from the real problems.

People like Epstein and mass woman/child exploitation have existed for thousands of years in the past, and will exist thousands of years in the future. It's part of the nature of the rich and powerful to execute on their deranged fetishes, it's been documented in writing since at least the Roman and Ottoman empires.

Hell, I can guarantee you there are other Epsteins operating in the wild right now that we haven't heard of (yet); he was in no way unique. I can also guarantee you that 1 in 5-10 normal looking people you meet daily on the street have similar deranged desires as the guests on Epstein's island but can't execute on them because they're not as rich and influential to get away with it, but they'd do it if they could.

KaiserPro 7 hours ago | parent | next [-]

> That's like the 1993 moral panic that video games like Doom cause mass shootings,

Except that Doom wasn't producing illegal content.

The point is that Grok is generating illegal content in those jurisdictions. In France you can't generate CSAM; in the UK you can't distribute CSAM. Those are actual laws with legal tests, and the images don't need to be of actual people - they just need to depict _children_ to be illegal.

Moral panics require new laws to enforce, generally. This is just enforcing already existing laws.

Moreover, had it been any other site, it would have been totally shut down by now and the servers impounded. It's only because Musk is rich and close to Trump that he's escaped the fate that you or I would have faced if we'd done the same.

joe_mamba 6 hours ago | parent [-]

>Apart from doom wasn't producing illegal content.

Sure, but where's the proof that Grok is actually producing illegal content? I searched for news sources, but they're all just parroting empty accusations, not concrete documented cases.

pasc1878 5 hours ago | parent | next [-]

See https://www.bbc.co.uk/news/articles/cvg1mzlryxeo

Note that the IWF is not a random charity; it works with the police on these matters.

I found this as the first item in a Kagi search - perhaps you should try non-AI searches.

KaiserPro 5 hours ago | parent | prev [-]

> but they're just all parroting empty accusations not concrete documented cases.

In the UK it is illegal to create, distribute and store CSAM. A news site printing a CSAM photo would put them legally up the shitter.

However, the IWF, who are tasked with detecting this stuff, have claimed to have found evidence of it, along with multiple other sources; Ofcom, who are nominally supposed to police this, have an open investigation, and so do the Irish police.

The point is, law has a higher threshold of proof than news, which takes time. If there is enough evidence, then a court case (or other instrument) will be invoked.

amelius 7 hours ago | parent | prev | next [-]

Another line of reasoning is that with more fake CP it is more difficult to research the real CP, hunt down the perpetrators, and consequently save children.

joe_mamba 7 hours ago | parent [-]

Oh yeah, because the main reason Epstein and his guests got away with it for so long is that there was so much low-hanging CP out there confusing authorities and prosecutors, not the corruption, cronyism and political protection they enjoyed at the highest levels of government.

Do you guys even hear yourselves?

amelius 7 hours ago | parent [-]

But how about the "1 in 5-10 normal looking people you meet daily on the street have similar deranged desires as the guests on Epstein's island but can't execute on them because they're not as rich and influential to get away with it, but they'd do it if they could."

Some of those might still try.

joe_mamba 7 hours ago | parent [-]

>Some of those might still try.

And what does AI have to do with this? Haven't child predators existed before AI?

Where's the proof that AI produces more child predators?

You're just going in circles without any arguments.

amelius 6 hours ago | parent [-]

It has to do with AI because:

> Another line of reasoning is that with more fake CP it is more difficult to research the real CP, hunt down the perpetrators, and consequently save children.

(own quote)

Yes, the predators existed before AI, but also:

> I think the reasoning is that the AI contributes to more offenders (edited).

(own quote, edited)

To be clear, I don't think this line of reasoning is entirely convincing, but apparently some people do.

watwut 5 hours ago | parent | prev [-]

No, 20% of the population is not seeking to abuse children or teens. If you think so, you are moving in weird circles. In fact, what we also have are people who noped out of Epstein's circle or even openly criticized it for years.

Also, framing the issue of sexual abuse by untouchable elites as the same as the superhero comic issue (which itself was not just about superhero comics, and you should know it) is spectacularly bad faith.

Yes, there have always been people who steal, abuse and murder for their own gain and fun. That is not an argument for why we should accept and support it as the normalized state of the world. It is a good reason to prevent people from becoming too powerful and to build accountable institutions able to catch and punish them.

ilogik 7 hours ago | parent | prev | next [-]

The UK is also opening investigations into the Epstein stuff.

https://www.reuters.com/world/uk/starmers-government-aids-po...

Unlike the US administration, which seems to be fine with what Epstein and X are doing

GordonS 7 hours ago | parent | next [-]

Except Starmer is making sure that the "investigation" is hobbled - anything deemed important to "national security" will be excluded!

The UK's "investigation" is a farce.

joe_mamba 7 hours ago | parent | prev [-]

[flagged]

SiempreViernes 7 hours ago | parent | next [-]

What's this comment about? Do you think no other CSAM distribution should be investigated until the stuff in Epstein files is sorted?

owebmaster 6 hours ago | parent | prev [-]

[flagged]

owebmaster 6 hours ago | parent | prev | next [-]

The same guy you are defending, who is responsible for creating child porn, is also on Epstein's list. Also, don't abbreviate child pornography; it shows you have a side in this.

cess11 7 hours ago | parent | prev [-]

"Grok" is part of the Epstein network, connected through Elon Musk.

tick_tock_tick 6 hours ago | parent | prev [-]

No need to be coy: the raid exists because it's a way to punish the company without proving anything. They have zero intention of getting even the slightest bit of valuable data related to Grok from this.

fyredge 34 minutes ago | parent | next [-]

Unlike the current American administration, which condones raids on homes without warrants and justifies violence with lies, this French raid follows something called the rule of law.

So no, don't be coy and pretend that all governments are like American institutions.

direwolf20 6 hours ago | parent | prev [-]

What's your evidence?