gadders 6 hours ago

[flagged]

pjc50 6 hours ago

The differing factors are scale (now "deepfakes" can be produced automatically) and endorsement. Significantly, these images aren't being posted by random users; they appear under the company's own @grok handle. That makes them speech by X itself, which is why it's X that's getting raided.

mnewme 5 hours ago

There is no content like that on Bluesky or Mastodon. Show the evidence.

GaryBluto 4 hours ago

> There is no content like that on [...] Mastodon.

How can you say that nobody is posting CSAM on a massive decentralized social network with thousands of servers?

Zenst 4 hours ago

https://bsky.social/about/blog/01-17-2025-moderation-2024

"In 2024, Bluesky submitted 1,154 reports for confirmed CSAM to the National Centre for Missing and Exploited Children (NCMEC). Reports consist of the account details, along with manually reviewed media by one of our specialized child safety moderators. Each report can involve many pieces of media, though most reports involve under five pieces of media."

If it weren't there, there would be no reports.

mnewme 4 hours ago

But that's exactly the difference: they actually do something about it.

Zenst 3 hours ago

https://blog.x.com/en_us/topics/company/2023/an-update-on-ou...