scott_w 3 hours ago
> That post doesn't contain such an admission, it instead talks about forbidden prompting.

In response to what? If CSAM is not being generated, why aren't X just saying that? Instead they're saying "please don't do it."

> which contradicts your claim that there were no guardrails before.

From the linked post:

> However content is created or whether users are free or paid subscribers, our Safety team are working around the clock to add additional safeguards

Which was posted a full week after the initial story broke, and after Ofcom started investigative action. So no, it does not contradict my point, which was:

> Also the reason you can't make it generate those images is because they implemented safeguards since that article was written:

As you quoted. I really can't decide if you're stupid, think I and other readers are stupid, or are so dedicated to defending paedophilia that you'll just tell flat lies to everyone reading your comment.
cubefox 27 minutes ago | parent
Keep your accusations to yourself. Grok already refused to generate naked pictures of adults months ago, when I tested it for the first time. Clearly the "additional safeguards" are meant to protect the system against any jailbreaks.