leptons | 2 days ago
>"challenge me when I'm wrong, and tell me I'm right if I am" As if an LLM could ever know right from wrong about anything. >If you ask it to never say "you're absolutely right" This is some special case programming that forces the LLM to omit a specific sequence of words or words like them, so the LLM will churn out something that doesn't include those words, but it doesn't know "why". It doesn't really know anything. |