pton_xd 11 hours ago

Yep, LLMs tell you "what you want to hear." I can usually predict the response I'll get based on how I phrase the question.

jonplackett 8 hours ago

I feel like LLMs have a bit of a Clever Hans effect. They take a lot of cues from me about what they think I want them to say, or what opinion they think I want them to have.

Clever Hans was a horse who people thought could do maths by tapping his hoof. But actually he was just reading the body language of the person asking the question: he'd notice them tense up as he reached the right number of taps and stop - still pretty smart for a horse, but the human was still doing the maths!

not_maz 8 hours ago

What's worse is that it can sometimes (but not always) see through your anti-bias prompts.

    "No, I want your honest opinion." "It's awesome."
    "I'm going to invest $250,000 into this. Tell me what you really think." "You should do it."

    (New Session)

    "Someone pitched to me the idea that..." "Reject it."