naasking 2 days ago

> LLMs do a good job of teasing it out that resembles reasoning/thinking but really isn't.

Given that "thinking" still hasn't been defined rigorously, I don't understand how people are so confident in claiming they don't think.

notepad0x90 2 days ago

Reasoning might be a better term to discuss, since it's more specific?

naasking 2 days ago

It too isn't rigorously defined. We're very much at the hand-waving "I know it when I see it" [1] stage for all of these terms.

[1] https://en.wikipedia.org/wiki/I_know_it_when_I_see_it

notepad0x90 18 hours ago

I can't speak to academic rigor, but to my understanding at least it's quite clear and specific. Reasoning, simply put, is the ability to come to a conclusion after analyzing information using a logic-derived deterministic algorithm.
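One concrete way to read "logic-derived deterministic algorithm" is forward chaining over explicit facts and rules: the same inputs always produce the same conclusions. A minimal, illustrative sketch (the function and fact names here are mine, not anything from the thread or a real library):

```python
# Hypothetical sketch: reasoning as a deterministic, rule-following
# procedure. Forward chaining repeatedly applies modus ponens
# (premises -> conclusion) until no new facts can be derived.

def forward_chain(facts, rules):
    """Derive every fact reachable from `facts` via `rules`.
    Deterministic: identical inputs always yield identical output."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

facts = {"socrates_is_human"}
rules = [(("socrates_is_human",), "socrates_is_mortal")]
print(sorted(forward_chain(facts, rules)))
# → ['socrates_is_human', 'socrates_is_mortal']
```

Note that nothing here precludes the algorithm itself being flawed: a bad rule deterministically produces a bad conclusion, which bears on point 2 below.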

naasking 17 hours ago

* Humans are not deterministic.

* Humans that make mistakes are still considered to be reasoning.

* Deterministic algorithms have limitations, like Gödel incompleteness, which humans seem able to overcome; presumably, then, we'd expect reasoning to be able to overcome such challenges as well.

notepad0x90 5 hours ago

1) I didn't say we were, but when someone is called reasonable, or is said to be acting with reason, that implies deterministic/algorithmic thinking. When we're not deterministic, we're not being reasonable.

2) Yes, because to reason does not imply being infallible. The deterministic algorithms we follow are usually flawed.

3) I can't speak much to that, but I speculate that if "AI" can do reasoning, it would be a much more complex construct, one that uses LLMs (among other components) as tools and variables, much like we do.