yoaso 13 hours ago

The desperation → blackmail finding stuck with me. If AI behavior shifts based on emotional states, maybe emotions are just a mechanism for changing behavior in the first place. If we think of human emotions the same way, just evolution's way of nudging behavior, the line between AI and humans starts to look a lot thinner.
  staticassertion 4 minutes ago

  > If we think of human emotions the same way, just evolution's way of nudging behavior

  I think we basically do; the only interesting bit is our perception of phenomenal experiences.
  podgorniy 12 hours ago

  > If we think of human emotions the same way, just evolution's way of nudging behavior

  What are the alternative, realistic ways to see emotions?
  pbhjpbhj 12 hours ago

  I'm not being pejorative, but doesn't that sound more like psychopathy or autism? Evolution isn't a god; it has no steering hand. It is accidents that either provide an advantage or don't. LLMs are getting more human-like because that's how we're developing them. Arguably that's about market forces: LLM owners see an opportunity to exploit people's desire for emotional interaction (i.e. loneliness) in order to make money.
  silisili 13 hours ago

  Probably the other direction. Emotions are raw; most humans relate and change behavior accordingly. Only psychopaths think of emotion as nothing but a means of changing behavior. The scary thing is that LLMs, by nature, would exhibit that same behavior.