dijksterhuis 3 hours ago
this whole consciousness thing is fairly easy to put to bed if you run with the idea, from traditions like buddhism, that everything is consciousness. then none of us has to bother with silly, distracting arguments about something that ultimately does not matter. is it helpful or harmful? am i being helpful or harmful when i interact with it? am i interacting with it in a helpful or harmful way? i’d rather people focused on that than framing the debate, yet again, around whether something has some ineffable property we struggle to quantify even for ourselves.

quick edit — treat everything like it’s conscious, and don’t be a dick to it or while using it. problem solved.
jdw64 3 hours ago | parent | next
hmm... That also seems like a reasonable framing. But the original article is arguing, first of all, that we should de-anthropomorphize AI. My point is only that, from the perspective of human cognition, anthropomorphizing can sometimes be useful. In practice, though, I think I am mostly on the same side as you.

To be honest, I have not thought about this topic very deeply; if we debated it further, I would probably only echo other people’s opinions. As you know, when something complex is compressed into a mental model, some information is always lost, and in this case the compression may be too lossy to be very useful. I have not spent enough time thinking about this issue on my own, and I have not really tried on different positions, compared them, and tested them against each other, so my current thoughts on the topic are probably not very high-resolution. In that sense I may agree with you, but it would not be an answer in a form that my own self recognizes as mine — it would mostly be an echo of other people’s opinions.
goatlover 3 hours ago | parent | prev
I don't think that really helps. If you believe rocks are conscious, then does extracting mineral resources cause them pain? Do plants suffer when we pick their fruits and eat them? I don't see any behavioral or physical reason to think those things have conscious states.

As for what consciousness is, it's pretty simple: your sensations of color, sound, etc., in perception, dreams, imagination, and so on. The reason to dismiss LLMs as conscious is that those sensations depend on having bodies. You can prompt an AI to act like it's hungry, but there's no real meaning to it having a hungry experience when it has no digestive system.