daveguy 8 hours ago
Maybe the role reversal breaks most of the RLHF training. The training was definitely not done in the context of role reversal, so it could be out of distribution. If so, this is a glimpse of the intelligence of the LLM core without the RL/RAG/etc. tape-and-glue layers.