BoredPositron 3 hours ago

So if you make a mistake and say sorry, you are also a psychopath?

ludwik 2 hours ago | parent | next [-]

I think the point of comparison (whether I agree with it or not) is someone (or something) that is unable to feel remorse saying “I’m sorry” because they recognize that’s what you’re supposed to do in that situation, regardless of their internal feelings. That doesn’t mean everyone who says “sorry” is a psychopath.

BoredPositron 2 hours ago | parent [-]

We are talking about an LLM; it does what it has learned. Ascribing human tics or characteristics to it when the response makes sense (i.e., saying sorry) is a user problem.

ludwik an hour ago | parent [-]

Okay? I specifically responded to your comment, which claimed the parent comment implied "if you make a mistake and say sorry you are also a psychopath" — which clearly wasn’t the case. I don’t see what your response has to do with that.

pyrale an hour ago | parent | prev | next [-]

No, the point is that saying sorry because you're genuinely sorry is different from saying sorry because you expect that's what the other person wants to hear. Everybody does that sometimes but doing it every time is an issue.

In the case of LLMs, they are basically trained to output what they predict a human would say; there is no further meaning to the program outputting "sorry" than that.

I don't think the comparison with people with psychopathy should be pushed further than this specific aspect.
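To make the "predicting what a human would say" point concrete, here is a minimal toy sketch of greedy next-token selection. The context string and probability table are invented for illustration; a real LLM works over learned weights and huge vocabularies, but the mechanical shape — pick the continuation that scored highest, with no internal state of remorse — is the same idea.

```python
# Toy next-token predictor. The probability table is made up for
# illustration; it stands in for what a model absorbs from training data.
NEXT_TOKEN_PROBS = {
    "I made a mistake.": {"Sorry": 0.6, "Oops": 0.3, "Whatever": 0.1},
}

def predict_next(context: str) -> str:
    # Greedy decoding: return the single most probable continuation.
    # Nothing here models regret; "Sorry" wins only because it scores highest.
    probs = NEXT_TOKEN_PROBS[context]
    return max(probs, key=probs.get)

print(predict_next("I made a mistake."))  # -> Sorry
```

The output "Sorry" falls out of the probability table alone, which is the whole point: the apology is a statistical artifact of the context, not an expression of feeling.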

BoredPositron an hour ago | parent [-]

You provided the logical explanation for why the model acts the way it does. At the moment it's nothing more and nothing less. Expected behavior.

lazide 14 minutes ago | parent [-]

Notably, if we look at this abstractly/mechanically, psychopaths (and to some extent sociopaths) do study and mimic ‘normal’ human behavior (and even the appearance of specific emotions) to both fit in, and to get what they want.

So while the internals differ (LLM model weights vs. human thinking), the mechanical output can actually appear, or even be, similar in some ways.

Which is a bit scary, now that I think about it.

camillomiller 2 hours ago | parent | prev [-]

Are you smart people all suddenly imbeciles when it comes to AI, or is this purposeful gaslighting because you're invested in the Ponzi scheme? This is a purely logical problem. Comments like these completely disregard the fallacy of comparing humans to AI as if complete parity had been achieved. Also, the way these comments disregard human nature is so profoundly misanthropic that it sickens me.

BoredPositron 2 hours ago | parent [-]

No, but the conclusions in this thread are hilarious. We know why it says sorry: because that's what it learned to do in a situation like that. People who feel mocked, or who call an LLM a psychopath in a case like that, don't seem to understand the technology either.

camillomiller an hour ago | parent [-]

I agree, "psychopath" is the wrong adjective. It refers to an entity with a psyche, which the illness affects. That said, I do believe the people who decided to have it behave like this for the sake of its commercial success are indeed the pathological individuals. I believe there is currently a wave of collective psychopathology that has taken over Silicon Valley, with the reinforcement that only a successful community backed by a lot of money can give you.