scott_w 7 hours ago

Did you miss the numerous news reports? Example: https://www.theguardian.com/technology/2026/jan/08/ai-chatbo... For obvious reasons, decent people are not about to go out and try to generate child sexual abuse material to prove a point to you, if that's what you're asking for.
cubefox 7 hours ago | parent

First of all, the Guardian is known to be heavily biased against Musk. They always try hard to make everything about him sound as negative as possible. Second, last time I tried, Grok even refused to create pictures of naked adults. I just tried again and this is still the case: https://x.com/i/grok/share/1cd2a181583f473f811c0d58996232ab The claim that they released a tool with "seemingly no guardrails" is therefore clearly false. I think what instead happened here is that some people found a way to circumvent those guardrails via something like a jailbreak.