dragonwriter | 3 days ago
> Humans don't require input to, say, decide to go for a walk.

Impossible to falsify, since humans are continuously receiving inputs from both external and internal sensors.

> What's missing in the LLM is volition.

What's missing is embodiment, or at least a continuous loop feeding in a wide variety of inputs about the state of the world. Given that, plus information about a set of tools by which it can act in the world, I have no doubt that current LLMs would exhibit some kind of volitional-seeming action (possibly not desirable or coherent from a human POV, at least without a whole lot of prompt engineering).
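To make the "continuous loop plus tools" idea concrete, here is a minimal sketch of what such a loop could look like. Everything in it (read_sensors, query_model, the tool names) is a hypothetical stand-in, not any particular LLM API:

    import random
    import time

    def read_sensors():
        # stand-in for "a wide variety of inputs about the state of the world"
        return {"time": time.time(), "temperature": 20 + random.random()}

    def query_model(observation, tools):
        # stand-in for an LLM call; a real loop would prompt the model with the
        # observation plus tool descriptions and parse a tool choice from the reply
        return random.choice(sorted(tools))

    TOOLS = {
        "go_for_walk": lambda: print("walking..."),
        "do_nothing": lambda: None,
    }

    while True:
        obs = read_sensors()              # continuous input, not a one-off prompt
        action = query_model(obs, TOOLS)  # the model "decides" given state and tools
        TOOLS[action]()                   # act in the world
        time.sleep(1.0)

The point of the sketch is only the shape of the loop: read state, offer tools, act, repeat.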
jmcodes | 3 days ago
Our entire existence and experience is nothing _but_ input. Temperature changes, visual stimulus, auditory stimulus, body cues, random thoughts firing, etc. Those are all going on all the time.
goatlover | 3 days ago

Random thoughts firing wouldn't be input; they're a process internal to the organism.
jmcodes | 3 days ago

It's a process that I don't have conscious control over. I don't choose to think random thoughts; they appear. Which is different from thoughts I consciously choose to think and engage with. From my subjective perspective, it is an input into my field of awareness.
zeroonetwothree | 3 days ago

Your subjective experience is only the tip of the iceberg of your brain's total activity. The conscious part is merely a tool your brain uses to help it achieve its goals; there's no inherent reason to favor it.
|
|
|
|
IanCal | 2 days ago
LLMs can absolutely generate output without input, but we don't have zero input either. We don't exist in a floating void with no light or sound or touch or heat or feelings from our own body.

But again, this doesn't seem to be the same thing as thinking. If I could only reply to you when you send me a message, but could reason through any problem we discuss just as the "able to want a walk" me could, would that mean I could no longer think? I think these are different issues.

On that, though, these seem trivially solvable with loops and a bit of memory to write to and read from (a rough sketch follows) - would that really make the difference for you? A box set up to run continuously like this would be thinking?
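For what it's worth, the "loop plus a bit of memory" version is easy to sketch too. call_model here is a hypothetical stand-in for whatever model would actually be prompted with the saved notes and the new input:

    memory = []  # notes the model can read back on later passes

    def call_model(recent_notes, observation):
        # stand-in: a real version would prompt an LLM with recent_notes and the
        # observation and parse a (thought, new_note) pair out of its reply
        return f"thought about {observation}", f"saw {observation}"

    for step in range(10):                    # stand-in for "run continuously"
        observation = f"tick {step}"          # whatever input happens to be available
        thought, note = call_model(memory[-5:], observation)  # read recent memory
        memory.append(note)                   # write back for future iterations
        print(thought)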
|
ithkuil | 3 days ago
It's as if an LLM is only one part of a brain, not the whole thing. So of course it doesn't do everything a human does, but it can still do some aspects of mental processes.

Whether "thinking" means "everything a human brain does" or a specific cognitive process that we humans do is a matter of definition. I'd argue that defining "thinking" independently of "volition" is useful, because it allows us to break things down into parts and understand them.
|
BeetleB | 3 days ago
> Humans don't require input to, say, decide to go for a walk.

Very much a subject of contention. How do you even know you're awake, without any input?
|
esafak | 3 days ago
| I would not say it is missing but thankfully absent. |