FrustratedMonky 8 hours ago

Doesn't this still presume that we understand our own consciousness, in order to make the comparison.

Where does our survival instinct come from? And why couldn't AI have one?

>>>Additional

Also, reproduction. Humans are basically just Food, Sex, Survival. And consciousness is just a rule set for fulfilling those goals. So if a NN, modeled on us, develops the same rules, why can't it have the same degree of consciousness? Who says we are conscious?

nzeid 8 hours ago | parent | next [-]

The paper isn't saying "AI can't have one"; it's saying (very approximately) that behavioral mimicry is not the path to one.

FrustratedMonky 8 hours ago | parent [-]

That is a good point.

Just wondering: once an 'AI model of some form' is in a physical body, a 'robot', and is provided with some rules about survival so it doesn't fall into a hole, then after a series of these events, does it matter? Does mimicry become reality, or at least no longer differentiable from it?

Kind of the philosophical zombie argument. If a robot can perfectly mimic a human, can you really know that the internal state of the 'real' one is different from the 'mimicked' one?

nzeid 7 hours ago | parent [-]

The paper isn't concerned specifically with survival. It's saying that you cannot achieve "abstraction" (presumably the structure that underlies critical thinking, creativity, etc.) through sheer mimicry.

Again, just echoing the paper here. I don't know that I'm doing it justice.

yannyu 8 hours ago | parent | prev | next [-]

If AI has a survival instinct, then we should theoretically see evidence of it if we construct the right environment for AI to express it. Animals and cellular organisms demonstrate a survival instinct under the right conditions, so we would have to find equivalent conditions for a hypothetical machine intelligence.

Conversely, we know that if we take animals that do have a survival instinct and put them into the wrong kinds of environments, they will not thrive and will degenerate or possibly commit suicide. Similarly, if AI did have a survival instinct, do we think we've created an environment where that could be reasonably tested and observed?

drxzcl 8 hours ago | parent | next [-]

I can make an AI system with a survival instinct right now. Of course, all that will do is make people tell me “it’s not a proper survival instinct” or move the goal posts and tell me I need yet some other property.

This whole endeavor is doomed from the beginning. There is no crucial test for “consciousness”, just ad hoc criteria people come up with to land on the conclusions that leave their belief system intact.

Consciousness is not a concept that can be rendered operational.

Ekaros 8 hours ago | parent [-]

I can make a state machine that acts like it has a survival instinct. But it certainly isn't something we would consider conscious. So I am not exactly sure how good most of these tests are.
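To make that concrete, a few hard-coded rules are already enough to produce survival-seeking behavior. A minimal Python sketch (the state fields, thresholds, and action names are purely illustrative choices, not from any real system):

```python
# A hand-coded policy whose behavior looks like a "survival instinct":
# it flees threats and seeks food when energy runs low, with no learning
# and nothing anyone would call inner experience.

def survival_policy(state):
    """Map an observed state to an action using fixed rules."""
    if state["threat_nearby"]:
        return "flee"       # avoid immediate danger first
    if state["energy"] < 30:
        return "seek_food"  # replenish energy before it runs out
    return "wander"         # default behavior when nothing is urgent

print(survival_policy({"threat_nearby": True,  "energy": 80}))  # flee
print(survival_policy({"threat_nearby": False, "energy": 10}))  # seek_food
print(survival_policy({"threat_nearby": False, "energy": 90}))  # wander
```

Any behavioral test that counts "tries to stay alive" as evidence of consciousness would be passed by this handful of if-statements.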

drxzcl 8 hours ago | parent [-]

But what would we consider conscious?

My position is that there is no actual, definitive answer to that question, and therefore it makes no sense engaging with the concept.

FrustratedMonky 8 hours ago | parent | prev [-]

That is the entire plot of 'Ex Machina'.

There are plenty of people who say AI has already displayed a survival instinct, by threatening users if they talk about shutting it down, or by using blackmail to get funds to source an external machine to run on.

There are a bunch of articles proclaiming AI is trying to break out, but I can't find a real study on it.

https://www.wsj.com/opinion/ai-is-learning-to-escape-human-c...

colordrops 8 hours ago | parent | prev [-]

Asking humans to discuss consciousness is like asking Super Mario to discuss screen pixels. We have no freaking idea. Everyone on all sides, physicalists, idealists, and everything in between, is full of it.

FrustratedMonky 4 hours ago | parent [-]

You might dig the works of Donald Hoffman.

https://en.wikipedia.org/wiki/Donald_D._Hoffman

He often uses similar examples.

colordrops 2 hours ago | parent [-]

Indeed interesting. It seems his theories are a particular strain of idealism. I probably lean more towards idealism than physicalism, but I don't think it's the whole picture. It's still missing something.