tracerbulletx 7 hours ago

We don't even know what the pre-requisites for consciousness are, so we have no way of knowing. LLMs have emergent behavior reminiscent of language-forming brains, but they're also missing a lot of properties that are probably necessary: mainly continuity over time, more integrated memory, and a better sense of space and time. Brains use the rhythm and timing of neuronal firings, and even the length of axons affects computation; they do a lot of different things with signals and patterns. But in any case, without knowing what consciousness is, I don't know which of those things are required.

boxed an hour ago | parent | next [-]

> We don't even know what the pre-requisites for consciousness are so we have no way of knowing.

Imo we don't even have a definition of the word that we agree on.

qsera 15 minutes ago | parent | next [-]

The ability to feel pain or pleasure is a good indicator, I think.

echoangle 2 minutes ago | parent [-]

And how do you define pain and pleasure? Do insects feel pain?

pydry 37 minutes ago | parent | prev [-]

We're pretty clear on the distinction between a conscious and an unconscious human.

We might not clearly understand the difference between the two states, but we can certainly point to it and say "it's that."

freedomben 33 minutes ago | parent | next [-]

I'm not sure it's that clear. What about a person on drugs to the point that they clearly don't know what's happening around them, but can still speak and move? I'm not sure I'd call that conscious, but by most definitions it is.

agnosticmantis 31 minutes ago | parent | prev [-]

Now discuss whether a bonobo, a dog, a cat, a mouse, an ant, or a bacterium is conscious.

And you’ll find it’s not as clear cut.

throwuxiytayq an hour ago | parent | prev [-]

Clive Wearing's memory lasts less than 30 seconds, so he has no memory of being awake before the present moment. He is permanently in a state of feeling like he has just woken up, observing his surroundings for the first time.

Clive Wearing's mind has no continuity over time and essentially zero memory integration. Is he not conscious? There are interviews with him.

Where on the scale [No mind <-> Clive Wearing <-> Healthy human brain] would you put an LLM with a 10M token context window?