madeofpalk 6 hours ago

Let’s take a step back and remove AI generation from the conversation for a moment.

Did X do enough to prevent its website being used to distribute illegal content - non-consensual sexual material of adults and sexual material of children?

Now reintroduce AI generation, where X plays a more active role in facilitating the creation of that illegal content.

muyuu 6 hours ago | parent [-]

"Enough" can always be pushed into the impossible. That's why laws and regulations need to be more concrete than that.

There's essentially a push to end the remnants of the free speech Internet by making the medium responsible for the speech of its participants. Let's not pretend otherwise.

KaiserPro 5 hours ago | parent [-]

The law is concrete on this.

In the UK, you must take "reasonable" steps to remove illegal content.

This normally means some basic detection (i.e. fingerprinting against a collaborative database, which is widely used), or banning a user who is consistently uploading such material.
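
To make "fingerprinting against a collaborative database" concrete, here is a minimal sketch of that check in Python. It is an illustration only: a plain SHA-256 stands in for the perceptual hashes (e.g. PhotoDNA) that real shared hash lists use, and names like KNOWN_BAD_HASHES and should_block are hypothetical, not any real API.

    # Sketch of hash-list screening on upload. Real systems use perceptual
    # hashes that survive resizing/re-encoding; SHA-256 is a stand-in only.
    import hashlib

    # Hypothetical fingerprints distributed by a collaborative database.
    KNOWN_BAD_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(data: bytes) -> str:
        """Compute the fingerprint of an uploaded file (exact-match stand-in)."""
        return hashlib.sha256(data).hexdigest()

    def should_block(data: bytes) -> bool:
        """True if the upload matches a known fingerprint."""
        return fingerprint(data) in KNOWN_BAD_HASHES

    # Usage: screen an upload before it is stored or served.
    if should_block(b"uploaded file bytes"):
        print("blocked and reported")
    else:
        print("allowed")

A real deployment would pair this with reporting and with banning repeat uploaders, the second half of the "reasonable steps" described above.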

Allowing a service that you run to continue to generate said illegal content, even after you publicly admit that you know it's wrong, is not reasonable.

muyuu 5 hours ago | parent [-]

that doesn't sound concrete to me, at all

KaiserPro 4 hours ago | parent | next [-]

Nothing in common law is "concrete"; that's kind of the point of it.

Judges can interpret and evolve the law as they see fit, and that evolution becomes case law.

This is why in the US the Supreme Court can effectively change the law by issuing a binding ruling (see the Second Amendment being read as barring gun laws, rather than as written, or the recent racial profiling cases).

direwolf20 4 hours ago | parent | prev | next [-]

No law is concrete. Murder is killing with intent to kill. What concrete test shows whether someone intended to kill? They say you have intent to kill if a reasonable person would expect the actions you took to result in killing.

disgruntledphd2 5 hours ago | parent | prev [-]

It's about as concrete as one gets in the UK/US/Anglosphere law tradition.

muyuu 5 hours ago | parent [-]

if you can be sued for billions because some overbearing body, with a very different ideology to yours, can deem your moderation/censorship rules to be "unreasonable", then what you do is err on the side of caution and allow nearly nothing

this is not compatible with that line of business - perhaps one of the reasons nothing is done in Europe these days

disgruntledphd2 an hour ago | parent | next [-]

> this is not compatible with that line of business - perhaps one of the reasons nothing is done in Europe these days

Except for 40% of all Big Tech products, a vast industrial network of companies, the safe airplane building, and the decent financial services that don't take 3% of everything - then yeah, I guess nothing is done in Europe these days.

And wait, wasn't most of Google's AI stuff acquired from a European company?

Honestly, while Europe has a lot of problems, this notion that many US people have that literally nothing happens there is wildly off-base.

muyuu 31 minutes ago | parent [-]

https://i.postimg.cc/vBhVsvFN/image.png

direwolf20 4 hours ago | parent | prev | next [-]

They advertised that you could use the tool to undress people; that's pretty clearly on the unreasonable side of the line.

KaiserPro 4 hours ago | parent | prev [-]

sigh

The vast majority of the EU is not common law, so "reasonable" in this instance is different.

What you describe already happens in the USA; that's why MLB has that weird local TV blackout, and why bad actors use copyright to take down content they don't like.

The reason it's so easy to do that is that companies must reasonably comply with copyright holders' requests.

It's the same with CSAM: distributing it doesn't have First Amendment protection, and knowingly distributing it is illegal. All reasonable steps should be taken to detect and remove CSAM from your systems to qualify for safe harbour.

muyuu 4 hours ago | parent [-]

sigh indeed

> It's the same with CSAM: distributing it doesn't have First Amendment protection, and knowingly distributing it is illegal. All reasonable steps should be taken to detect and remove CSAM from your systems to qualify for safe harbour.

nice try, but nobody is distributing or hosting CSAM in the current conversation

people trying to trick a bot into posting bikini pictures of preteens and blaming the platform for it is a ridiculous stretch of the concept of hosting CSAM, which really is a transparent attack on a perceived political opponent to push for a completely different model of the internet from the pre-existing one - a transition that is as obvious as it is already advanced in Europe and most of the so-called Anglosphere

> The vast majority of the EU is not common law, so "reasonable" in this instance is different.

the vast majority of the EU is perhaps incompatible with any workable notion of free speech, so perhaps America will have to choose whether it's worth it to sanction them into submission, or cut them off at considerable economic loss

it's not a coincidence that next to nothing is built in Europe these days; the environment is one of fear and stifling regulation, and if I were to actually release anything in either AI or social networks I'd do what most of my fellow Brits/Europoors do already, which is to either sell to America or flee this place before I get big enough to show up on the euro-borg's radar

KaiserPro 3 hours ago | parent [-]

> nice try, but nobody is distributing or hosting CSAM in the current conversation

multiple agencies (Ofcom, the Irish police, the IWF, and whatever the French regulator is) have detected CSAM.

You may disagree with that statement, but bear in mind that the definition of CSAM in the UK is "depiction of a child", which means that whether it is of a real child or entirely generated is not relevant. This was to stop people claiming that the massive cache of child porn they had was photoshopped.

In the USA, CSAM is equally vaguely defined, but the case law is different.

> EU is perhaps incompatible with any workable notion of free speech

I mean the ECHR definition is fairly robust. But given that First Amendment protection has effectively ended in the USA (the president is currently threatening to take a comedian to court for making jokes - you know, like the Twitter bomb-threat person in the UK), it's a bit rich really. The USA is not the bastion of free speech it once was.

> either sell to America or flee this place before I get big enough to show up in the euro-borg's radar

Mate, as someone who's sold a startup to the USA, it's not about regulations, it's about cold hard fucking cash. All major companies comply with EU regs, and it's not hard. They just bitch about them so that the USA doesn't put in basic data protection laws, so they can continue to be monopolies.