zahlman 5 days ago

> You might not be aware of what that concept means in terms of LLMs

GP is perfectly aware of this, and disagrees that the metaphor behind the term is apt.

Just because you use a word to describe a phenomenon doesn't make that phenomenon similar, in all the ways people will find salient, to the other phenomena previously described by that word.

When AIs generate code that calls a non-existent function, it's not because they momentarily, mistakenly perceive (i.e., "hallucinate") that function in the documentation. It's because the name they've chosen fits their model of what a function that performs the necessary task might be called.
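A minimal sketch of the phenomenon in Python (the names here are illustrative; `datetime.parse_date` is deliberately not a real stdlib function, which is the point):

    import datetime

    # A model that has seen json.loads, ast.parse, and urllib.parse may
    # plausibly emit a call like this: the name fits the pattern of real
    # APIs, but no documentation ever mentioned it.
    try:
        datetime.parse_date("2024-01-01")  # plausible name, nonexistent
    except AttributeError as err:
        print(err)  # module 'datetime' has no attribute 'parse_date'

    # The real API the model was presumably "aiming at":
    print(datetime.datetime.strptime("2024-01-01", "%Y-%m-%d"))

Nothing in that failure resembles a misperception; the invented name is simply the statistically plausible continuation.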

And even that framing grants that they model the task itself (as opposed to the words and phrases that describe the task), and that a capacity to reason about that task has somehow arisen from a pure language model (whereas humans can, from infancy, actually observe reality and contemplate the effects of their actions on the real world around them). Knowing that, e.g., the word "oven" often follows the word "hot" is not, in fact, tantamount to understanding heat.

In short, they don't perceive, at all. So how can they be mistaken in their perception?