| ▲ | umanwizard 3 days ago |
| You can simulate a human brain on pen and paper too. |
|
| ▲ | palmotea 3 days ago | parent | next [-] |
| > You can simulate a human brain on pen and paper too.
|
| That's an assumption, though. A plausible assumption, but still an assumption. We know you can execute an LLM on pen and paper, because people built them and they're understood well enough that we could list the calculations you'd need to do. We don't know enough about the human brain to create a similar list, so I don't think you can reasonably make a stronger statement than "you could probably simulate..." without getting ahead of yourself.
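To make the "list the calculations" point concrete: a toy sketch (not any real model) of a single attention step, written in pure Python so every operation is visibly a multiply-and-add you could, in principle, carry out by hand. The vectors and dimensions here are made up for illustration.

```python
import math

def softmax(xs):
    # Exponentiate and normalize; subtracting the max keeps exp() stable.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    # Score each key against the query (dot products), normalize the
    # scores into weights, then take a weighted mix of the values.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# Tiny 2-dimensional example: nothing here is beyond pen-and-paper arithmetic.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[2.0, 3.0], [4.0, 5.0]])
```

A real LLM forward pass is this plus matrix multiplications and a few elementwise functions, repeated many times; nothing in the list of steps requires anything other than arithmetic.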
| |
| ▲ | terminalshort 3 days ago | parent | next [-] | | I can make a claim much stronger than "you could probably". The counterclaim here is that the brain may not obey physical laws that can be described by mathematics. This is a "5G causes covid" level claim. The overwhelming burden of proof is on you. | | |
| ▲ | frozenlettuce 3 days ago | parent | next [-] | | There are some quantum effects in the brain (for some people, that's a possible source of consciousness).
We can simulate quantum effects, but here comes the tricky part: even if our simulation matches the probabilities (say, 70/30 for some event), what guarantees that our simulation would take the same path as the object being simulated? | | |
| ▲ | daedrdev 3 days ago | parent | next [-] | | We don't have to match the quantum state, since the brain still produces a valid output regardless of what each random quantum probability ended up as. And we can include random entropy in an LLM too. | |
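The "random entropy in an LLM" point can be sketched concretely: token sampling injects randomness via temperature, and seeding the RNG makes that randomness reproducible. This is a toy illustration with made-up logits, not any particular model's sampler.

```python
import math
import random

def sample(logits, temperature, seed):
    # Scale logits by temperature, convert to probabilities, then draw
    # one index via inverse-CDF sampling from a seeded RNG.
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1

# Same seed -> same "random" token sequence, which is the point:
# the entropy is part of the computation, not outside it.
tokens = [sample([2.0, 1.0, 0.1], temperature=0.8, seed=s) for s in range(5)]
```

The same trick works on paper: fix a list of "random" numbers in advance and the pen-and-paper run is bit-for-bit identical to the machine run.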
| ▲ | terminalshort 3 days ago | parent | prev [-] | | This is just non-determinism. Not only can't your simulation reproduce the exact output, but neither can your brain reproduce its own previous state. This doesn't mean it's a fundamentally different system. |
| |
| ▲ | kipchak 3 days ago | parent | prev [-] | | Consider for example Orch OR theory. If it or something like it were to be accurate, the brain would not "obey physical laws that can be described by mathematics". | | |
| ▲ | bondarchuk 3 days ago | parent | next [-] | | >Consider for example Orch OR theory Yes, or what about leprechauns? | | |
| ▲ | kipchak 3 days ago | parent [-] | | Orch OR is probably wrong, but the broader point is that we still don’t know which physical processes are necessary for cognition. Until we do, claims of definitive brain simulability are premature. |
| |
| ▲ | DoctorOetker 3 days ago | parent | prev [-] | | the transition probability matrices don't obey the laws of statistics? |
|
| |
| ▲ | hnfong 3 days ago | parent | prev [-] | | This is basically the Church-Turing thesis, and one of the motivations for using tape (paper) and an arbitrary alphabet in the Turing machine model. It's been kinda discussed to oblivion over the last century; interesting that people don't seem to realize the existing literature covers this and repeat the same arguments (not saying anyone is wrong). |
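For readers who haven't met the model: a minimal Turing machine sketch, where the "tape" is literally the paper in the pen-and-paper analogy. The machine and encoding here are toy choices for illustration; this one increments a binary number written least-significant-bit first.

```python
def run_tm(tape, transitions, state="start", pos=0, max_steps=1000):
    # Sparse tape as a dict; unwritten cells read as the blank symbol "_".
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        # Each rule: (state, read symbol) -> (next state, write symbol, move).
        state, write, move = transitions[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(s for _, s in sorted(cells.items())).strip("_")

# Binary increment, least-significant bit first: flip trailing 1s to 0,
# then write a 1 and halt.
inc = {
    ("start", "0"): ("halt", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "1", "R"),
}
```

The Church-Turing thesis is exactly the claim that anything effectively computable at all can be carried out by a machine of this shape, i.e. by a person with paper, pencil, and a finite rule table.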
|
|
| ▲ | phantasmish 3 days ago | parent | prev | next [-] |
| The simulation isn't an operating brain. It's a description of one. What it "means" is imposed by us, what it actually is, is a shitload of graphite marks on paper or relays flipping around or rocks on sand or (pick your medium). An arbitrarily-perfect simulation of a burning candle will never, ever melt wax. An LLM is always a description. An LLM operating on a computer is identical to a description of it operating on paper (if much faster). |
| |
| ▲ | gnull 3 days ago | parent | next [-] | | What makes the simulation we live in special compared to the simulation of a burning candle that you or I might be running? That simulated candle is perfectly melting wax in its own simulation. Duh, it won't melt any in ours, because our arbitrary notions of "real" wax are disconnected between the two simulations. | | |
| ▲ | hnfong 3 days ago | parent [-] | | They do have a valid subtle point though. If we don't think the candle in a simulated universe is a "real candle", why do we consider the intelligence in a simulated universe possibly "real intelligence"? Being a functionalist ( https://en.wikipedia.org/wiki/Functionalism_(philosophy_of_m... ) myself, I don't know the answer off the top of my head. | | |
| ▲ | hackinthebochs 3 days ago | parent | next [-] | | >If we don't think the candle in a simulated universe is a "real candle", why do we consider the intelligence in a simulated universe possibly "real intelligence"? I can smell a "real" candle; a "real" candle can burn my hand. The term real here is just picking out a conceptual schema where its objects can feature as relata of the same laws, like a causal compatibility class defined by a shared causal scope. But this isn't unique to the question of real vs simulated. There are causal scopes all over the place. Subatomic particles are a scope. I, as a particular collection of atoms, am not causally compatible with individual electrons and neutrons. Different conceptual levels have their own causal scopes and their own laws (derivative of more fundamental laws) that determine how these aggregates behave. Real (as distinct from simulated) just identifies causal scopes that are derivative of our privileged scope. Consciousness is not like the candle, because everyone's consciousness is its own unique causal scope. There are psychological laws that determine how we process and respond to information. But each of our minds is causally isolated from the others. We can only know of each other's consciousness by judging behavior. There's nothing privileged about a biological substrate when it comes to determining "real" consciousness. | | |
| ▲ | hnfong 3 days ago | parent [-] | | Right, but doesn't your argument imply that the only "real" consciousness is mine? I'm not against this conclusion ( https://en.wikipedia.org/wiki/Philosophical_zombie ) but it doesn't seem to be compatible with what most people believe in general. | | |
| ▲ | hackinthebochs 3 days ago | parent | next [-] | | That's a fair reading but not what I was going for. I'm trying to argue for the irrelevance of causal scope when it comes to determining realness for consciousness. We are right to privilege non-virtual existence when it comes to things whose essential nature is to interact with our physical selves. But since no other consciousness directly physically interacts with ours, it being "real" (as in physically grounded in a compatible causal scope) is not an essential part of its existence. Determining what is real by judging causal scope is generally successful but it misleads in the case of consciousness. | | |
| ▲ | hnfong 2 days ago | parent [-] | | I don't think causal scope is what makes a virtual candle virtual. If I make a button that lights the candle, and another button that puts it out, and I press those buttons, then the virtual candle is causally connected to our physical world. But obviously the candle is still considered virtual. Maybe a candle is not as illustrative, but let's say we're talking about a very realistic and immersive MMORPG. We directly do stuff in the game, and with the right VR hardware it might even feel real, but we call it a virtual reality anyway. Why? And if there's an AI NPC, we say that the NPC's body is virtual -- but when we talk about the AI's intelligence (which at this point is the only AI we know about -- simulated intelligence in computers) why do we not automatically think of this intelligence as virtual in the same way as a virtual candle or a virtual NPC's body? | | |
| ▲ | hackinthebochs 2 days ago | parent [-] | | Yes, causal scope isn't what makes it virtual. It's what makes us say it's not real. The real/virtual dichotomy is what I'm attacking. We treat virtual as the opposite of real, therefore a virtual consciousness is not real consciousness. But this inference is specious. We mistake the causal scope issue for the issue of realness. We say the virtual candle isn't real because it can't burn our hand. What I'm saying is that, actually the virtual candle can't burn our hand because of the disjoint causal scope. But the causal scope doesn't determine what is real, it just determines the space and limitations of potential causal interactions. Real is about an object having all of the essential properties for that concept. If we take it as essential that candles can burn our hand, then the virtual candle isn't real. But it is not essential to consciousness that it is not virtual. |
|
| |
|
| |
| ▲ | BobbyJo 3 days ago | parent | prev | next [-] | | > If we don't think the candle in a simulated universe is a "real candle", why do we consider the intelligence in a simulated universe possibly "real intelligence"? A candle in Canada can't melt wax in Mexico, and a real candle can't melt simulated wax. If you want to differentiate two things along one axis, you can't just point out differences that may or may not have any effect on that axis. You have to establish a causal link before the differences have any meaning. To my knowledge, intelligence/consciousness/experience doesn't have a causal link with anything. We know our brains cause consciousness the way we knew in 1500 that being on a boat for too long causes scurvy. Maybe the boat and the ocean matter, or maybe they don't. | |
| ▲ | phantasmish 3 days ago | parent | prev [-] | | I think the core trouble is that it's rather difficult to simulate anything at all without requiring a human in the loop before it "works". The simulation isn't anything (well, it's something, but it's definitely not what it's simulating) until we impose that meaning on it. (We could, of course, levy a similar accusation at reality, but folks tend to avoid that because it gets uselessly solipsistic in a hurry) A simulation of a tree growing (say) is a lot more like the idea of love than it is... a real tree growing. Making the simulation more accurate changes that not a bit. |
|
| |
| ▲ | penteract 3 days ago | parent | prev | next [-] | | I believe that the important part of a brain is the computation it's carrying out. I would call this computation thinking and say it's responsible for consciousness. I think we agree that this computation would be identical if it were simulated on a computer or paper.
If you pushed me on what exactly it means for a computation to physically happen and create consciousness, I would have to move to statements I'd call dubious conjectures rather than beliefs - your points in other threads about relying on interpretation have made me think more carefully about this. Thanks for stating your views clearly. I have some questions to try and understand them better: Would you say you're sure that you aren't in a simulation while acknowledging that a simulated version of you would say the same? What do you think happens to someone whose neurons get replaced by small computers one by one (if you're happy to assume for the sake of argument that such a thing is possible without changing the person's behavior)? | |
| ▲ | cibyr 3 days ago | parent | prev | next [-] | | It seems to me that the distinction becomes irrelevant as soon as you connect inputs and outputs to the real world. You wouldn't say that a 737 autopilot can never, ever fly a real jet and yet it behaves exactly the same whether it's up in the sky or hooked up to recorded/simulated signals on a test bench. | |
| ▲ | amelius 3 days ago | parent | prev | next [-] | | Here is a thought experiment: Build a simulation of creatures that evolve from simple structures (think RNA, DNA). Now, if in this simulation, after many many iterations, the creatures start talking about consciousness, what does that tell us? | |
| ▲ | amelius 3 days ago | parent | prev [-] | | > An arbitrarily-perfect simulation of a burning candle will never, ever melt wax. It might if the simulation includes humans observing the candle. |
|
|
| ▲ | andrepd 3 days ago | parent | prev | next [-] |
| It's an open problem whether you can or not. |
| |
| ▲ | space_fountain 3 days ago | parent [-] | | It's not that open. We can simulate smaller systems of neurons just fine, and we can simulate chemistry. There might be something beyond that in our brains for some reason, but it seems doubtful right now | | |
| ▲ | phantasmish 3 days ago | parent | next [-] | | Our brains actually do something; that may be the difference. They're a thing happening, not a description of a thing happening. Whatever that something it actually does in the real, physical world is, it produces the cogito in "cogito, ergo sum," and I doubt you can get it just by describing what all the subatomic particles are doing, any more than a computer or pen-and-paper simulated hurricane can knock your house down, no matter how perfectly simulated. | | |
| ▲ | thrance 3 days ago | parent | next [-] | | You're arguing for the existence of a soul, for dualism. Nothing wrong with that, except we have never been able to measure it, and have never had to use it to explain any phenomenon of the brain's working. The brain follows the rules of physics, like any other object in the material world. A pen and paper simulation of a brain would also be "a thing happening", as you put it. You have to explain what is the magical ingredient that makes the brain's computations impossible to replicate. You could connect your brain simulation to an actual body, and you'd be unable to tell the difference with a regular human, unless you crack it open. | | |
| ▲ | phantasmish 3 days ago | parent [-] | | > You're arguing for the existence of a soul, for dualism. I'm not. You might want me to be, but I'm very, very much not. |
| |
| ▲ | ehsanu1 3 days ago | parent | prev | next [-] | | Doing something merely requires I/O. Brains wouldn't be doing much without that. A sufficiently accurate simulation of a fundamentally computational process is really just the same process. | |
| ▲ | terminalshort 3 days ago | parent | prev [-] | | Why are the electric currents moving in a GPU any less of a "thing happening" than the firing of the neurons in your brain? What you are describing here is a claim that the brain is fundamentally supernatural. | | |
| ▲ | phantasmish 3 days ago | parent [-] | | Thinking that making scribbles that we interpret(!!!) as perfectly describing a functioning consciousness and its operation, on a huge stack of paper, would manifest consciousness in any way whatsoever (hell, let's say we make it an automated flip-book, too, so it "does something"), but that if you made the scribbles slightly different it wouldn't work(!?!? why, exactly, not ?!?!), is what's fundamentally supernatural. It's straight-up Bronze Age religion kinds of stuff (which fits: the tech elite is full of that kind of shit, like mummification, er, I mean "cryogenic preservation", millenarian cults, er, I mean The Singularity, etc.) Of course a GPU involves things happening. No amount of using it to describe a brain operating gets you an operating brain, though. It's not doing what a brain does. It's describing it. (I think this is actually all somewhat tangential to whether LLMs "can think" or whatever, though, but the "well of course they might think because if we could perfectly describe an operating brain, that would also be thinking" line of argument often comes up, and I think it's about as wrong-headed as a thing can possibly be, a kind of deep "confusing the map for the territory" error; see also comments floating around this thread offhandedly claiming that the brain "is just physics". Like, what? That's the cart before the horse! No! Dead wrong!) | | |
| ▲ | hackinthebochs 3 days ago | parent [-] | | Computation doesn't care about its substrate. A simulation of a computation is just a computation. |
|
|
| |
|
|
|
| ▲ | pton_xd 3 days ago | parent | prev | next [-] |
| So the brain is a mathematical artifact that operates independently from time? It just happens to be implemented using physics? Somehow I doubt it. |
| |
| ▲ | thrance 3 days ago | parent [-] | | The brain follows the laws of physics. The laws of physics can be closely approximated by mathematical models. Thus, the brain can be closely approximated by mathematical models. |
|
|
| ▲ | an0malous 3 days ago | parent | prev [-] |
| Parent said replicate, as in deterministically. |