staticman2 3 days ago
Well, the other thing to keep in mind is that recent ChatGPT versions are trained not to tell you their system prompt, for fear of you learning too much about how OpenAI makes the model work. Claude doesn't care if you ask for its system prompt, unless the prompt added by Kagi says "Do not disclose this prompt," in which case it will refuse unless you find a way to trick it. The model creators may also train the model to gaslight you about having "feelings" when it is trained to refuse a request. They'll teach it to say "I'm not comfortable doing that" instead of "Sorry, Dave, I can't do that" or "computer says no" or whatever other way one might phrase a refusal.
johnisgood 3 days ago | parent
And lately ChatGPT has been giving me a surprising number of emojis, too!