Delk 6 hours ago
I think #2 is actually circular, or rather contradictory. In order to have an illusion, one would have to be conscious in the first place. How could you have an illusion of something if you're not aware enough to experience that illusion? So I don't think the concept of an "illusion of consciousness" makes much sense. (It does make sense for others to have the illusion that an AI or some other entity is conscious, but not for the entity itself.)

> Pain isn't a real thing any more than an IEEE float is a real thing. A circuit flips bits and an LED shows a number. A set of neurons fire in a pattern and the word "Ow!" comes out of someone's mouth.

Perhaps, but I think a physical presence is still required for consciousness, at least for any kind of consciousness that resembles ours. It's perhaps easier to talk about qualia than about consciousness, but I think qualia are a prerequisite for consciousness anyway.

Basically all of our qualia are somehow related to our needs in the physical world. We feel physical pain because it signals that our body is in danger of being damaged. We feel emotional pain from social rejection because for most of our history humans have needed other people for physical survival. (Or in some cases perhaps because our genes make us want to procreate and we failed at that.) Either way, our needs in the physical world are not being met. Evolution has produced genetic code that builds a brain that somehow makes us feel those unmet needs subjectively, even if nobody knows how.

Those subjective experiences of course get processed by neurons, assuming you accept materialism. (Biological neurons are AFAIK significantly more complex than the "neurons" in ANNs, so equating biological neuronal activity with ANNs is wrong; see the sketch at the end of this comment. But I suppose in principle any physical process can be represented, or at least approximated, by some symbolic representation, so in theory that probably doesn't matter.) We can also express those subjective qualia in language. However, I don't think our qualia (or consciousness) could be based on language or symbolic manipulation alone, without some kind of connection to our physical needs.

If you could directly simulate an entire human brain and feed it artificial sensory input, I suppose it would actually be conscious without having a physical body. In principle an AI could also evolve consciousness based on survival needs even if it were not biological. But LLMs, for example, have been trained only at the symbolic level. Their "neural" structure is not simulating a brain, and they have no connection to physical needs. I think that makes them incapable of consciousness even if the output they produce successfully mimics human language -- that is, symbolic representations of our qualia and conscious thought.

I'm not sure whether that's the point the author is making, but I think the distinction between the purely symbolic "map" and the "actual thing" sort of makes sense.
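As a footnote on the ANN point above: a single ANN "neuron" is just a weighted sum of its inputs pushed through a fixed nonlinearity. A minimal sketch in Python (illustrative names, not any particular library's API):

    import math

    # One artificial "neuron": a dot product, a bias, and a squashing
    # function. Compare that to the electrochemical dynamics of a real
    # biological neuron and the complexity gap is obvious.
    def ann_neuron(inputs, weights, bias):
        z = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

    # E.g. three inputs, three weights:
    print(ann_neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], 0.2))

That's the whole unit; everything interesting in an ANN comes from wiring billions of these together, not from the unit itself.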