sothatsit 3 days ago

> No.

You are missing the forest for the trees by dismissing this so readily.

LLMs can solve IMO-level math problems, debug quite difficult bugs in moderately sized codebases, and write prototypes for highly unusual coding projects. They solve difficult reasoning problems, and so I find it mystifying that people still work so hard to justify their belief that they're "not actually reasoning". They are flawed reasoners in some sense, but it seems ludicrous to me to suggest that they are not reasoning at all when they generalise to new logical problems so well.

Do you think humans are logical machines? No, we are not. Therefore, do we not reason?

southernplaces7 3 days ago

>Do you think humans are logical machines? No, we are not. Therefore, do we not reason?

No, but we are conscious, and we know we are conscious, which doesn't require us to be logical beings. LLMs, on the other hand, aren't conscious, and there's zero evidence that they are. Thus, they don't reason, since reasoning, unlike logic, does require consciousness.

Why not avoid re-defining things into a salad mix of poor logic until you can pretend that something with no evidence in its favor is real?

sothatsit 2 days ago

The idea that reasoning requires consciousness is very silly. Not to mention that consciousness is such a poorly defined term in the first place.