subjectivationx 16 hours ago

Everyone reading this understands the meaning of a sunrise. It is a wonderful example of the use theory of meaning.

If you raised a baby in a windowless solitary confinement cell for 20 years and then one day showed them a sunrise on a video monitor, they still wouldn't understand the meaning of a sunrise.

Trying to have a machine extract the meaning of a sunrise from the syntax of a sunrise data corpus is just absurd.

You could extract some statistical regularity from the pixel data on the sunrise video monitor, or from a sunrise data corpus. That model may produce useful results that can then be applied in the lived world.

Pretending the model understands a sunrise though is just nonsense.

Offering the statistical model's usefulness in the lived world as proof that it understands a sunrise, I would say, borders on intellectual fraud, considering that a human doing the same thing wouldn't understand a sunrise either.

ajross 15 hours ago | parent

> Everyone reading this understands the meaning of a sunrise

For a definition of "understands" that resists rigor and repeatability, sure. This is what I meant by reducing it to a semantic argument. You're just asserting that AI is impossible, and that doesn't constitute evidence for your position. Your opponents in the argument, who feel AGI is imminent, are likewise just handwaving.

To wit: none of you people have any idea what you're talking about. No one does. So take off the high hat and stop pretending you do.

meroes 13 hours ago | parent

This all just boils down to the Chinese Room thought experiment, where I'm pretty sure the consensus is that nothing in the experiment (neither the person inside nor the whole emergent room, etc.) understands Chinese the way we do.

Another example from Searle: a computer simulating digestion is not digesting the way a stomach does.

The people saying AI can't emerge from LLMs are on the consensus side of the Chinese Room. The digestion simulator could tell us where every single atom of a stomach is while it digests a meal, and it's still not digestion. Only once the computer breaks down food particles chemically and physically is it digestion. Only once an LLM receives photons, or has the physical capacity to receive them, is there anything like "seeing a night sky".