frankohn 4 days ago

I found the questioning of love very interesting. I myself have wondered whether an LLM can have emotions. Based on the book I am reading, Behave: The Biology of Humans at Our Best and Worst by Robert Sapolsky, I think LLMs, as they are now with the architecture they have, cannot have emotions. They just verbalize as if they sort of have emotions, but these are only verbal patterns or responses they learned.

I have come to think they cannot have emotions because emotions are generated in parts of our brain that are not logical/rational. They emerge in response to environmental stimuli, mediated by hormones and other complex neuro-physical systems, not from reasoning or verbalization. So they don't arise from our logical or reasoning capabilities. However, these emotions are then integrated by the rest of the brain, including the logical/rational parts like the dlPFC (dorsolateral prefrontal cortex, the real center of our rationality). Once an emotion is raised, it is woven into our inner reasoning and it affects our behavior.

What I have come to understand is that love is one such emotion, generated by our nature to push us to take care of people close to us: our children, our partners, our parents, etc. More specifically, love seems to be mediated largely by hormones like oxytocin and vasopressin, so it has a biochemical basis. An LLM cannot love because it doesn't have the "hardware" to generate these emotions and integrate them into its verbal inner reasoning. It was just trained with reinforcement learning from human feedback to behave well. That works to some extent, but from its training corpora it also learned how to behave badly and can on occasion express those behaviors; still, it has no emotions.

willguest 4 days ago | parent [-]

I was also intrigued by the machine's reference to it, especially because it posed the question with full recognition of its machine-ness.

Your comment about the generation of emotions strikes me as quite mechanistic and brain-centric. My understanding, and lived experience, has led me to an appreciation that emotion is a kind of psycho-somatic intelligence that steers both our body and cognition according to a broad set of circumstances. This is rooted in a pluralistic conception of self that is aligned with the idea of embodied cognition. Work by Michael Levin, an experimental biologist, indicates we are made of "agential material" - at all scales, from the cell to the person, we are capable of goal-oriented cognition (used in a very broad sense).

As for whether machines can feel, I don't really know. They seem to represent an expression of our cognitivist norms in the way they are made and, given the human tendency to anthropomorphise communicative systems, we easily project our own feelings onto them. My gut feeling is that, once we can give the models an embodied sense of the world, including the ability to physically explore and make spatially motivated decisions, we might get closer to understanding this. However, once that happens, I suspect that our conceptions of embodied cognition will be challenged by the behaviour of the non-human intellect.

As Levin says, we are notoriously bad at recognising other forms of intelligence, despite the fact that global ecology abounds with examples. Fungal networks are a good example.

frankohn 4 days ago | parent [-]

> My understanding, and lived experience, has led me to an appreciation that emotion is a kind of psycho-somatic intelligence that steers both our body and cognition according to a broad set of circumstances.

Well, from what I understand, it is true that some parts of our brain are more dedicated to processing emotions and to integrating them with the "rational" part of the brain. However, the real source of emotions is biochemical, coming from the body's hormones in response to environmental stimuli. The LLM doesn't have that. It cannot feel the urge to hug someone, the experience of being in love, or the parental drive to protect and care for children.

Without that, the LLM can only "verbalize" about emotions, as learned from its training corpora, but there are really no emotions behind it, just things it learned and can express in a cold, abstract way.

For example, we recognize that a human can behave and speak in a way that fakes an emotion without actually having it. We know how to act and talk as if we felt a specific emotion, while in our mind we know we are faking it. An LLM, by contrast, is physically incapable of having emotions at all, so all it can do is verbalize about them based on what it learned.