KoolKat23 3 days ago

1. Consciousness itself is probably just an illusion, a phenomenon/name for something that occurs when you bunch thinking together. Think of this objectively and base it on what we know of the brain. It's literally working off whatever hardware we have; there's no magic.

2. That's just a well-adapted neural network (I suspect more brain is left than you let on). A multimodal model making the most of its limited compute and whatever GPIO it has.

3. Humans navigate a pre-existing map that has already been built. We can't understand things in other dimensions and need to abstract them. We're mediocre at computation.

I know there are people who like to think humans should always be special.

adamzwasserman 3 days ago | parent | next [-]

1. 'Probably just an illusion' is doing heavy lifting here. Either provide evidence or admit this is speculation. You can't use an unproven claim about consciousness to dismiss concerns about conflating it with text generation.

2. Yes, there are documented cases of people with massive cranial cavities living normal lives. https://x.com/i/status/1728796851456156136. The point isn't that they have 'just enough' brain; it's that massive structural variation doesn't preclude function, which undermines simplistic 'right atomic arrangement = consciousness' claims.

3. You're equivocating. Humans navigate maps built by other humans through language. We also directly interact with physical reality and create new maps from that interaction. LLMs only have access to the maps - they can't taste coffee, stub their toe, or run an experiment. That's the difference.

KoolKat23 3 days ago | parent [-]

1. What's your definition of consciousness? Let's start there.

2. Absolutely, it's a spectrum. Insects have function.

3. "Humans navigate maps built by other humans through language." You said it yourself. LLMs use that exact same data, so why wouldn't they know it if they've used it? Humans are their bodies in the physical world.

adamzwasserman 3 days ago | parent [-]

1. I don't need to define consciousness to point out that you're using an unproven claim ('consciousness is probably an illusion') as the foundation of your argument. That's circular reasoning.

2. 'It's a spectrum' doesn't address the point. You claimed LLMs approximate brain function because they have similar architecture. Massive structural variation in biological brains producing similar function undermines that claim.

3. You're still missing it. Humans use language to describe discoveries made through physical interaction. LLMs can only recombine those descriptions. They can't discover that a description is wrong by stubbing their toe or running an experiment. Language is downstream of physical discovery, not a substitute for it.

KoolKat23 2 days ago | parent [-]

1. You do. You probably have a different definition and are saying I'm wrong merely for not holding it.

2. That directly addresses your point. In the abstract, it shows brains are basically no different from multimodal models: train on different data types and it still works, perhaps even better. LLMs are trained on images, video, sound, and nowadays even robot sensor feedback, with no fundamental changes to the architecture (see Gemini 2.5).

3. That's merely an additional input point: give it sensors or have a human relay that data. Your toe is relaying its sensor information to your brain. See the sketch below.
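To make the "just another input" point concrete, here is a minimal toy sketch in PyTorch. It's my own illustration, not how Gemini or any real model is built; the dimensions, layer counts, and the 3-axis "sensor" reading are arbitrary assumptions. It projects text tokens and a raw sensor sample into the same embedding space and feeds both through one shared transformer encoder:

    # Toy illustration: two modalities, one shared architecture.
    # Assumes PyTorch; all sizes here are arbitrary.
    import torch
    import torch.nn as nn

    D = 64  # shared embedding width

    text_embed  = nn.Embedding(1000, D)   # token ids -> D-dim vectors
    sensor_proj = nn.Linear(3, D)         # 3-axis sensor reading -> D-dim vector
    encoder = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=D, nhead=4, batch_first=True),
        num_layers=2,
    )

    tokens  = torch.randint(0, 1000, (1, 5))  # e.g. "stubbed my toe" as token ids
    reading = torch.randn(1, 1, 3)            # e.g. one pressure/temp/accel sample

    # Concatenate both modalities into one sequence; the encoder doesn't care
    # which positions came from language and which came from a sensor.
    seq = torch.cat([text_embed(tokens), sensor_proj(reading)], dim=1)
    out = encoder(seq)
    print(out.shape)  # (1, 6, 64)

The encoder never distinguishes language positions from sensor positions; that's the sense in which a new modality is "just another input" to the same architecture.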

estearum 3 days ago | parent | prev | next [-]

> Consciousness itself is probably just an illusion

This is a major cop-out. The very concept of "illusion" implies a consciousness (a thing that can be illuded).

I think you've maybe heard that the sense of self is an illusion and you're mistakenly applying that to consciousness, which is quite literally the only thing in the universe we can be certain is not an illusion. The existence of one's own consciousness is the only thing one cannot possibly be illuded about (note: the contents of said consciousness are fully up for grabs).

KoolKat23 3 days ago | parent [-]

I mean people's perception of it as a single thing rather than a set of systems. But if that's your barometer, I'll say models are conscious. They may not have proper agency yet. But they are conscious.

zeroonetwothree 3 days ago | parent | prev [-]

Consciousness is an emergent behavior of a model that needs to incorporate its own existence into its predictions (and perhaps to some extent the complex behavior of same-species actors). So whether or not that is an 'illusion' really depends on what you mean by that.

KoolKat23 2 days ago | parent [-]

My use of the term "illusion" is shallower than that; I merely use it because people think consciousness is something separate and special.

Based on what you've described, the models already demonstrate this; it's implied, for example, in models' attempts to game tests to ensure survival/release into the wild.