abenga a day ago

Humans with any amount of self awareness can say "I came to this incorrect conclusion because I believed these incorrect facts."

pbh101 a day ago | parent [-]

Sure, but that explanation may itself be a story constructed post hoc, one that isn't the actual causal chain of the error, and the person doesn't realize it is just a story. That happens in many cases. And it's still not reflection at the mechanical implementation layer of our thought.

semiquaver a day ago | parent | next [-]

Yep. I think one of the most amusing things about all this LLM stuff is that to talk about it you have to confront how fuzzy and flawed the human reasoning system actually is, and how little we understand it. And yet it manages to do amazing things.

s1artibartfast a day ago | parent | prev [-]

I think humans can actually apply logical rigor. But both humans and models rely on stories. It is stories all the way down.

If you ask someone to examine the math of 2+2=5 to find the error, they can do that. However, doing so relies on stories about what each of those representational concepts means: what a 2 and a 5 are, and how they relate to each other and to other constructs.