polytely 7 months ago

> What separates this from human.

A lot. Like an incredible amount. A description of a thing is not the thing.

There is sensory input, qualia, pleasure & pain.

There is taste and judgement, disliking a character, being moved to tears by music.

There are personal relationships, being a part of a community, bonding through shared experience.

There is curiosity and openness.

There is being thrown into the world, your attitude towards life.

Looking at your thoughts and realizing you were wrong.

Smelling a smell that resurfaces a memory you forgot you had.

I would say the language completion part is only a small part of being human.

Aeolun 7 months ago | parent | next [-]

All of these things arise from a bunch of inscrutable neurons in your brain turning off and on again in a bizarre pattern though. Who's to say that isn't what happens in the million-neuron LLM brain?

Just because it’s not persistent doesn’t mean it’s not there.

Like, I’m sort of inclined to agree with you, but it doesn’t seem like it’s something uniquely human. It’s just a matter of degree.

jessemcbride 7 months ago | parent | next [-]

Who's to say that weather models don't actually get wet?

EricDeb 7 months ago | parent | prev | next [-]

I think you would need the biological components of a nervous system for some of these things

lordnacho 7 months ago | parent [-]

Why couldn't a different substrate produce the same structure?

elcritch 7 months ago | parent | prev [-]

Sure, in some ways it's just neurons firing in some pattern. Figuring out and replicating the correct sets of neuron patterns is another matter entirely.

Living creatures have a fundamental impetus to grow and reproduce that LLMs and AIs simply do not have currently. Not only that, but animals have a highly integrated neurology that has had billions of years of being tuned to that impetus. For example, the ways that sex interacts with mammalian neurology are pervasive. Same with the need for food, etc. That creates very different neural patterns than training LLMs does.

Eventually we may be able to re-create that balance of impetus, or will, or whatever we call it, to make sapience. I suspect we're fairly far from that, if only because the way we create LLMs is so fundamentally different.

CrulesAll 7 months ago | parent | prev | next [-]

"I would say the language completion part is only a small part of being human" Even that is only given to them. A machine does not understand language. It takes input and creates output based on a human's algorithm.

ekianjo 7 months ago | parent [-]

> A machine does not understand language

You can't prove humans do either. You can see how many times actual people struggle with understanding something that's written for them. In many ways, you can actually prove that LLMs are superior to humans right now when it comes to understanding text.

girvo 7 months ago | parent | next [-]

> In many ways, you can actually prove that LLMs are superior to humans right now when it comes to understanding text

Emphasis mine.

No, I don't think you can, without making "understanding" a term so broad as to be useless.

CrulesAll 7 months ago | parent | prev [-]

"You can't prove humans do either." Yes you can via results and cross examination. Humans are cybernetic systems(the science not the sci-fi). But you are missing the point. LLMs are code written by engineers. Saying LLMs understand text is the same as saying a chair understands text. LLMs' 'understanding' is nothing more than the engineers synthesizing linguistics. When I ask an A'I' the Capital of Ireland, it answers Dublin. It does not 'understand' the question. It recognizes the grammar according to an algorithm, and matches it against a probabilistic model given to it by an engineer based on training data. There is no understanding in any philosophical nor scientific sense.

lordnacho 7 months ago | parent [-]

> When I ask an A'I' the Capital of Ireland, it answers Dublin. It does not 'understand' the question.

You can do this trick as well. Haven't you ever been to a class that you didn't really understand, but you can give correct answers?

I've had this somewhat unsettling experience several times. Someone asks you a question, words come out of your mouth, the other person accepts your answer.

But you don't know why.

Here's a question you probably know the answer to, but don't know why:

- I'm having steak. What type of red wine should I have?

I don't know shit about Malbec, I don't know where it's from, I don't know why it's good for steak, I don't know who makes it, how it's made.

But if I'm sitting at a restaurant and someone asks me about wine, I know the answer.

the_gipsy 7 months ago | parent | prev | next [-]

That's a lot of words shitting on a lot of words.

You said nothing meaningful that couldn't also have been spat out by an LLM. So? What IS the secret sauce, then? Yes, you're a never-resting stream of words that took decades rather than years to train, and has a bunch of sensors and other, more useless, crap attached. It's technically better, but how does that matter? It's all the same.

DubiousPusher 7 months ago | parent | prev [-]

lol, qualia