pyrale an hour ago

No, the point is that saying sorry because you're genuinely sorry is different from saying sorry because you expect that's what the other person wants to hear. Everybody does that sometimes but doing it every time is an issue.

In the case of LLMs, they are basically trained to output what they predict a human would say; the program outputting "sorry" carries no further meaning than that.

I don't think the comparison with people with psychopathy should be pushed further than this specific aspect.

BoredPositron an hour ago | parent [-]

You provided the logical explanation for why the model acts the way it does. At the moment it's nothing more and nothing less. Expected behavior.

lazide 15 minutes ago | parent [-]

Notably, if we look at this abstractly/mechanically, psychopaths (and to some extent sociopaths) do study and mimic 'normal' human behavior (and even the appearance of specific emotions) both to fit in and to get what they want.

So while the internals differ (LLM model weights vs. human thinking), the mechanical output can actually appear, or even be, similar in some ways.

Which is a bit scary, now that I think about it.