mrob 3 days ago

I don't believe LLMs can be conscious during inference, because LLM inference is just repeated evaluation of a deterministic [0] pure function: it takes a list of tokens and outputs a probability distribution over the next token. Any randomness lives in the sampler that selects a token from that distribution, not in the LLM itself.
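
A minimal sketch of that loop (the model callable here is a hypothetical stand-in for the network's forward pass, not any real library's API):

    import random

    def generate(model, tokens, n_new):
        for _ in range(n_new):
            # Deterministic, pure: the same token list always yields
            # the same next-token distribution.
            probs = model(tokens)
            # All randomness lives here, in the sampler's draw.
            next_token = random.choices(range(len(probs)), weights=probs)[0]
            # The only "state" carried forward is the token list itself.
            tokens = tokens + [next_token]
        return tokens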

There is no internal state that persists between tokens [1], so there can be no continuity of consciousness. If it's "alive" in some way, it's effectively killed after each token and replaced by a new lifeform. I don't see how consciousness can exist without the possibility of change over time. The input tokens (context) can't be enough to give it consciousness, because it has no way of knowing whether they were generated by itself or by a third party. The sampler mechanism guarantees this: it's always possible that an unlikely token was selected by the sampler, so to detect "thought tampering" it would have to simulate itself evaluating every possible partial context. That alone would take an unreasonable amount of compute, but it's actually worse: the introspection process would itself affect the generated probabilities, so it would have to simulate itself simulating itself, and so on recursively without bound.
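
Concretely, reusing the hypothetical model and generate from the sketch above: because the forward pass is pure, the model cannot tell a context it wrote from one that was pasted in.

    prompt = [1, 2, 3]  # arbitrary token ids
    self_written = generate(model, prompt, n_new=10)  # tokens the model produced
    pasted_in = list(self_written)                    # same tokens, authored by someone else

    # Nothing inside the model records who wrote the context, so the
    # next-token distribution is identical either way.
    assert model(self_written) == model(pasted_in)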

It's conceivable that an LLM is conscious during training, but in that case the final weights are effectively its dead body, and inference is like Luigi Galvani poking the frog's legs with electrodes and watching them twitch.

[0] Assuming no race conditions in parallel implementations. llama.cpp is deterministic.

[1] Excluding caching, which is only a speed optimization and doesn't affect results.

lbrandy 3 days ago

I have no idea how you can assert what is necessary or sufficient for consciousness in this way. Your comment reads as though you believe you understand consciousness far better than I think anyone actually does.

mrob 3 days ago

I believe consciousness needs some kind of mutable internal state, because otherwise literally everything is conscious, which makes the concept useless. A rock "computes" the path it falls along when you drop it, but I don't believe rocks are conscious. Panpsychism is not a common belief.

bloppe 3 days ago

I think Nagel put it best in 1974: https://www.philosopher.eu/others-writings/nagel-what-is-it-...

Essentially, something is conscious iff "there is something that it is like to be" that thing. Some people find that completely unsatisfying; others think it's an insight of utter genius. I'm more in the latter camp.

Also, I think consciousness is non-binary. Something could be semi-conscious, or more or less conscious than something else.

Anyway, I don't think that there's anything that it's like to be an LLM. I don't see how anybody who knows how they actually work could think that.

lbrandy 3 days ago

> Anyway, I don't think that there's anything that it's like to be an LLM. I don't see how anybody who knows how they actually work could think that.

While I have almost zero belief that LLMs are conscious, I just don't think this can be so trivially asserted.

The easy half of this is concluding that LLMs aren't conscious given what we know about how they work. The hard half (and very famously so) is explaining how _you_ are conscious given what we know about how you work. You can't ignore that second half when making statements like this, because many of the obvious ways of arguing that LLMs clearly aren't conscious would apply just as well to you.

bloppe 2 days ago

I wouldn't say that we actually know how our brains work. Based mainly on my neuroscience minor from 10 years ago, I'd say that understanding feels hopelessly far away.

jdauriemma 3 days ago

I don't think the author is saying that LLMs are conscious or alive.

mrweasel 2 days ago

It would be kinda hilarious if the result of all this LLM research turned out to be that humans are basically just LLMs with more sensors and a long history.

dagss 2 days ago

Thinking != consciousness