| ▲ | cyrusradfar 3 days ago |
| I think the challenge with many of these conversations is that they assume consciousness emerges through purely mechanical means. The "brain as a computer" metaphor has been useful in limited contexts, especially for modeling memory or signal processing, but I don't think it helps us move forward when talking about consciousness itself. Penrose and Hameroff's quantum consciousness hypothesis, while still very speculative, is interesting precisely because it suggests that consciousness may arise from phenomena beyond classical computation. If that turns out to be true, it would also mean today's machines, no matter how advanced, aren't on a path to genuine consciousness. That said, AI doesn't need to think to be transformative. Steam engines weren't conscious either, yet they reshaped civilization. Likewise, AI and robotics can bring enormous value without ever approaching human-level awareness. We can hold both ideas at once: that machines may never be conscious, and that they can still be profoundly useful. |
|
| ▲ | roommin 3 days ago | parent | next [-] |
| The tendency to attribute consciousness to the quantum is one I find very grating. What makes the human brain any less mechanical if quantum mechanics rather than electrodynamics dictates the firing of neurons? Why does the wave nature of subatomic systems mean that an artificial tongue would suddenly be able to subjectively experience taste? It always reads to me as very woo-y, and any amount of drilling down leads to even more questions that seem to take the ideas further from reality. I think the strongest case for consciousness being a mechanical system is the fact that we can interface with it mechanically. We can introduce electricity, magnetic fields, chemicals, and scalpels to change the nature of people's experience and consciousness. Why is the incredible complexity of our brains an insufficient answer, while a secret qubit microtubule in each neuron is a more sound one? |
| |
| ▲ | ACCount37 3 days ago | parent [-] |
| Quantum effects are weird, and poorly understood, and are just about the only thing in the known universe that isn't deterministic. The human mind is weird, and poorly understood, and isn't deterministic - or, at least, most humans like to think that it isn't. No wonder the two are intuitively associated. The two kinds of magic fairy dust must have the same magic at their foundation! |
|
| ▲ | myrmidon 3 days ago | parent | prev | next [-] |
| > they assume consciousness emerges through purely mechanical means. |
| From my view, all the evidence points in exactly that direction though? Our consciousness can be suspended and affected by purely mechanical means, so clearly much of it has to reside in the physical realm. To me, quantum consciousness sounds too much like an overcomplicated version of the human exceptionalism we have always been prone to, just like geocentrism or our past self-image as the apex of creation. |
| |
| ▲ | CuriouslyC 3 days ago | parent [-] |
| Your memory formation gets inhibited and you become unresponsive under anesthesia. The brain still processes information. Let's take a step back from the "how" and talk about the "what". The fundamental dichotomy is emergent consciousness versus panpsychism. The irony is that even though panpsychism is seen as more fringe (because materialists won, smh), it's actually the explanation preferred by Occam's razor. Emergent consciousness needs a mechanism of emergence as well as separate dimensions of consciousness and matter, whereas panpsychism is good as is. To go one step further, idealism simplifies a lot of the weirdness around panpsychism. It's a strange world to live in where the elegant worldview that answers difficult problems cleanly is marginalized by an epicycle-laden one that creates paradoxes, just because the elegant view refutes the dominant religious paradigm and anthropocentrism. |
| ▲ | antonvs 2 days ago | parent [-] |
| Panpsychism doesn't explain anything; it just asserts that consciousness doesn't have an explanation, that it just "is". It's not impossible that something like panpsychism could be true, but knowing that wouldn't get us any closer to understanding consciousness. It also raises more questions than it answers, such as how an integrated consciousness arises within a brain/mind, whereas it presumably doesn't in, say, a hamburger patty. Ironically, attempts to explain that start to hint that such an explanation might not need to rely on panpsychism in the first place - i.e., if you can explain how consciousness arises from a sum of parts, you may not need to postulate that it exists independently of that combination of parts. |
| ▲ | CuriouslyC 2 days ago | parent [-] |
| Those questions you mentioned apply across the board, just in nuanced variants. Do you really think that postulating a non-physical system that we can't describe in physical terms (red is not a wavelength), which somehow magically emerges as a new dimension of "feeling" when the bits are arranged in the "right order", is less complex than the hypothesis that consciousness arranges itself into "structures" in much the same way as matter does? As for explaining consciousness, we can't even prove consciousness exists, so the thought of trying to explain "what" it is seems rather premature, but then that's humans for ya. |
| ▲ | myrmidon 2 days ago | parent | next [-] |
| I honestly don't see what the whole framework gets you. Red (or all qualia) is just the reaction of your nervous system to a stimulus. Since that reaction is shaped by common context/associations, the "subjective experience" is quite comparable between similarly raised humans. I think the whole philosophy of mind/subjective experience field is one of the few remaining anti-reductionist hold-outs, but I simply don't see a good enough motivation to stick with that view, especially given the abysmal historical track record for anti-reductionism (just consider early chemistry/alchemy, early biology, astronomy, ...). |
| ▲ | CuriouslyC 2 days ago | parent | next [-] |
| I'm cool with scientists taking the "shut up and calculate" approach; after all, we have to do science, and if you can't do experiments you're doing philosophy. The problem here is the same as with the quantum hypotheses -- people have forgotten that their materialist perspective is an experimental framework, and are trying to claim the map is the territory and put the cart before the horse. |
| ▲ | antonvs a day ago | parent | prev [-] |
| Calling that anti-reductionist is misunderstanding the issue. |
| > Red (or all qualia) is just the reaction of your nervous system to a stimulus. |
| Yes, Chalmers would call that one of the easy problems. Computers can do that - react to sensor data, which they have internal representations of - and most people don't assume they're conscious. The hard problem is how you get from that to a conscious experience of those stimuli, which we tend to assume computers (and LLMs?) don't have. That's not an anti-reductionist position; it's pointing out the fundamental philosophical difficulty in making the leap from non-conscious organizations of matter to conscious ones. Even a hard-core materialist/reductionist who is honest will acknowledge that, assuming they've understood the issue. |
| |
| ▲ | antonvs a day ago | parent | prev [-] |
| > Do you really think that postulating a non-physical system that we can't describe in physical terms (red is not a wavelength) |
| There's no mystery about what "red" is - even computers have an internal representation of sensor data, and our minds certainly do as well. "Red" is a representation of some physical state which is also, presumably, physically encoded in the brain. This is what Chalmers classifies as one of the "easy problems" of consciousness - there's no mystery here. The hard problem is that we have a conscious experience of color, along with everything else we're conscious of. By contrast, we don't generally assume that a computer executing code such as "if color == red ..." is having a conscious experience while it executes that code. (Although panpsychists may believe that.) |
| > which somehow magically emerges as a new dimension of "feeling" when the bits are arranged in the "right order", is less complex than the hypothesis that consciousness arranges itself into "structures" in much the same way as matter does? |
| That's not a hypothesis; it's simply handwaving. Both options are, given current knowledge. I wasn't promoting the first option; I was pointing out that if panpsychism requires a theory of how consciousness aggregates, which is similar to what emergence requires in terms of aggregating matter in certain ways, then the whole panpsychist proposal starts seeming like a candidate for Occam's Razor: what is it buying us, other than saying "this can't be explained"? |
| ▲ | CuriouslyC a day ago | parent [-] |
| Your representation of red is _not red_. It's a dual of the hard problem. There is no theoretical path for going from a wavelength of light to the quale red, and no amount of information about how the brain and perception work will change that. Just because we can state physical correlates of red perception doesn't change the fundamental difference in kind/dimension that sits at the heart of the hard problem, and of any equivalent problems. Regarding aggregation of consciousness, I think panpsychism buys us an actual experimental paradigm here, and I think people have already been exploring it without realizing it. I'm talking about split-brain and "multi-mind" research showing that people have multiple consciousnesses that each take over and drive under different circumstances. The idea that there are multiple separate aggregates in the brain at once, which hand off driving the body moment to moment, makes total sense under panpsychism but is a little weird for emergent consciousness theories. |
|
| ▲ | drdaeman 3 days ago | parent | prev | next [-] |
| > consciousness may arise from phenomena beyond classical computation |
| Sapolsky addresses this in "Determined", arguing that quantum effects don't bubble up far enough to alter behavior significantly. |
|
| ▲ | wry_discontent 3 days ago | parent | prev [-] |
| "Brain as computer" is just the latest iteration of a line of thinking that goes back forever. Whatever we kinda understand and interact with at the time, that's what we decide we are and what the brain is. Chemicals, electricity, clocks, steam engines, fire, earth: they're all analogies that help us learn but don't necessarily reflect an underlying reality. |