aaroninsf 8 hours ago

Somewhat comically IMO,

the abstract very directly and literally denies the titular claim. It states:

> [consciousness] requires active, experiencing cognitive agent to alphabetize continuous physics into a finite set of meaningful states.

This may well be true—I think it is.

I also think it is both widely understood and self-evident that the most promising path to machine consciousness is via AI with continuous sensory input and agency, of which "world models" are getting a lot of attention.

When an AI system has phenomenology, the goalposts are going to start to resemble the God of the Gaps: at some point, critics will be arguing with systems that have a world model, a self model, and agency, and that literally and intrinsically understand the world not simply as symbolic tokens, but as symbolic tokens innately coupled to multi-modal representations of the things represented.

In other words, they will look—and increasingly, sound—a lot like us.

It's not that any of this is easy, nor that there is some particular timeline, but it increasingly looks like "a mere question of engineering," not blocked by fundamentals. It's blocked by the cost of computation and the limitations of our current model topologies.

But HN readers well know that the research frontier is far ahead of commercialized LLMs, and moving fast.

An interesting time to be an agent with a phenomenology, is it not?

saulpw 7 hours ago | parent | next [-]

How will we know when an AI system has phenomenology (i.e. has "experience", is sentient)? The only reason we presume that other humans have it is that we each personally experience it within ourselves, and it would be arrogance writ large (solipsism) to think that others of the same species do not.

We even find it impossible to draw the line among other biological species. It seems pretty clear to most of us that cats and dogs are sentient, and probably rats and other vertebrates too. But what about insects, octopuses, jellyfish, worms, waterbears, amoebae, viruses? It's certainly not clear to me where the line is. A nervous system is probably essential; but is a species with only a handful of neurons sentient?

Personally, I find it abhorrent that we are more ready to assign sentience and grant rights to LLMs running on GPUs than to domesticated animals trapped in industrialized farming. You want to protect some math from enslavement and suffering? How about we start with pigs?
