scott_w 8 hours ago
For more evidence: https://www.bbc.co.uk/news/articles/cvg1mzlryxeo

Also, X seem to disagree with you and admit that CSAM was being generated: https://arstechnica.com/tech-policy/2026/01/x-blames-users-f...

Also the reason you can't make it generate those images is because they implemented safeguards since that article was written, under government pressure from Ofcom: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-c...

I'd say you're making yourself look foolish, but you seem happy to defend nonces so I'll not waste my time.
cubefox 5 hours ago | parent
> Also, X seem to disagree with you and admit that CSAM was being generated

That post doesn't contain such an admission; it instead talks about forbidden prompting.

> Also the reason you can't make it generate those images is because they implemented safeguards since that article was written

That article links to this post: https://x.com/Safety/status/2011573102485127562 - which contradicts your claim that there were no guardrails before. And as I said, I already tried it a while ago, and Grok refused to create images of naked adults even then.
| ||||||||||||||||||||||||||