vidarh 6 days ago
> Probabilistic generative models are fun but no amount of probabilistic sequence generation can be a substitute for logical reasoning.

Unless you claim either that humans can't do logical reasoning, or that humans exceed the Turing computable, this reasoning fails by Turing equivalence, given that you can trivially wire an LLM into a Turing-complete system. And both of those claims lack evidence.
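The "wire an LLM into a Turing-complete system" claim can be made concrete with a toy sketch (my illustration, not anything from the thread; the names `run_tm` and `INC` are hypothetical). The transition-table lookup is the spot where a deterministic, temperature-0 model call would plug in; the surrounding read/write/move loop is what supplies the Turing completeness.

```python
# A minimal Turing machine loop. The transition table stands in for a
# deterministic (temperature-0) LLM call: the loop, not the policy that
# picks transitions, is what makes the combined system Turing complete.

def run_tm(tape, transitions, state="start", blank="_", max_steps=10_000):
    """Run a Turing machine. `transitions` maps (state, symbol) ->
    (new_state, write_symbol, move), with move in {-1, 0, +1}."""
    cells = dict(enumerate(tape))
    head = len(cells) - 1  # start at the rightmost cell
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        # In the hypothetical LLM-backed version, this lookup would be a
        # model call returning the same (state, write, move) triple.
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: binary increment. Flip trailing 1s to 0 moving left,
# then write a 1 and halt.
INC = {
    ("start", "1"): ("start", "0", -1),
    ("start", "0"): ("halt", "1", 0),
    ("start", "_"): ("halt", "1", 0),
}

print(run_tm("1011", INC))  # -> 1100  (11 + 1 = 12 in binary)
```

Whether the policy driving the loop "understands" anything is exactly the dispute in the replies below; the sketch only shows that the computational-power objection doesn't go through on its own.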
voidhorse 5 days ago
Such a system redefines logical reasoning to the point that hardly any typical person's definition would agree. It's Searle's Chinese Room scenario all over again, which everyone seems to have forgotten amidst the BS marketing storm around LLMs. A person with no knowledge of Chinese, following a set of instructions and reading from a dictionary, can substitute for hiring a translator who understands Chinese; yet we would not claim that this person understands Chinese. An LLM hooked up to a Turing machine would be similar with respect to logical reasoning.

When we claim someone reasons logically, we usually don't imagine that they randomly throw ideas at the wall and then consult the outputs to determine whether they reasoned logically. Instead, the process of deduction makes the line of reasoning decidedly not stochastic. I can't believe we've gotten to such a mad place that basic notions like logical deduction are being confused with stochastic processes.

Ultimately, I would agree that it all comes back to the problem of other minds: you either take a fully reductionist stance and claim the brain and intellection are nothing more than probabilistic neural firing, or you take a non-reductionist stance and assume there may be more to it. In either case, I think that claiming LLMs+tools are equivalent to whatever process humans perform is kind of silly and severely underrates what humans are capable of^1.

1: Then again, this has been going on since the dawn of computing, which has always put forth its brain=computer metaphors more on grounds of reducing what we mean by "thought" than by any real, substantively justified connection.
11101010001100 6 days ago
So we just need a lot of monkeys at computers? | ||||||||||||||||||||||||||||||||||||||
godelski 6 days ago
Please don't do the "the proof is trivial and left to the reader" move[0]. If it is so trivial, show it. Don't hand-wave: "put up or shut up". I think if you work this out, you'll find it isn't so trivial... I'm aware of some works along these lines, but every one I know of has limitations that would not apply to LLMs. Plus, none of those are so trivial...