Aeolos | 4 days ago
> I very much think that AIs with minds are possible

The real question here is how would _we_ be able to recognize that? And would we even have the intellectual honesty to recognize it, when at large we seem inclined to dismiss everything non-human as self-evidently non-intelligent and incapable of feeling emotion?

Let's take emotions as a thought experiment. We know that plants are able to transmit chemical and electrical signals in response to various stimuli and environmental conditions, triggering effects in themselves and in other plants. Can we therefore say that plants feel emotions, just in a way that is unique to them and not necessarily identical to a human embodiment? The answer to that question depends on one's worldview, rather than any objective definition of the concept of emotion. One could say plants cannot feel emotions because emotions are a human (or at least animal) construct; or one could say that plants can feel emotions, just not exactly identical to human emotions. Now substitute plants with LLMs and try the thought experiment again.

In the end, where one draws the line between `human | animal | plant | computer` minds and emotions is primarily a subjective philosophical opinion rather than something rooted in any sort of objective evidence. Not too long ago, Descartes was arguing that animals do not possess a mind and cannot feel emotions, that they are merely mimicry machines.[1] More recently, doctors were saying similar things about babies and adults, leading to horrifying medical malpractice.[2][3]

Because in the most abstract sense, what is an emotion if not a set of electrochemical stimuli linking a certain input to a certain output? And how can we tell what does and what does not possess a mind if we are so undeniably bad at recognizing those attributes even within our own species?

[1] https://en.wikipedia.org/wiki/Animal_machine
jibal | 3 days ago | parent
> The real question here

No True Scotsman fallacy. Just because that interests you doesn't mean that it's "the real question".

> would we even have the intellectual honesty

Who is "we"? Some would and some wouldn't. And you're saying this in an environment where many people are attributing consciousness to LLMs. Blake Lemoine insisted that LaMDA was sentient and deserved legal protection, from his dialogs with it in which it talked about its friends and family -- neither of which it had. So don't talk to me about intellectual honesty.

> Can we therefore say that plants feel emotions

Only if you redefine emotions so broadly--contrary to normal usage--as to be able to make that claim. In the case of Strong AI there is no need to redefine terms.

> Now substitute plants with LLMs and try the thought experiment again.

Ok: "We know that [LLMs] are able to transmit chemical and electrical signals in response to various stimuli and environmental conditions, triggering effects in themselves and other [LLMs]." Nope.

> In the end, where one draws the line between `human | animal | plant | computer` minds and emotions is primarily a subjective philosophical opinion rather than rooted in any sort of objective evidence.

That's clearly your choice. I make a more scientific one.

> Because in the most abstract sense, what is an emotion if not a set of electrochemical stimuli linking a certain input to a certain output?

It's something much more specific than that, obviously. By that definition, all sorts of things that any rational person would want to distinguish from emotions qualify as emotions.

Bowing out of this discussion on grounds of intellectual honesty.