hodgehog11 6 days ago

The "sufficiently interesting" part is the most important qualifier here. My response was about theories and representations that we already know, whether instinctively from near birth or through learned experience. We have not seen anything unique from LLMs because they do not appear to have an internal understanding (in the same sense I was talking about) that is as broad as an adult human's. But that doesn't mean they lack any understanding.

> The key point in all of this is symbolism as abstractions to represent things in some world.

The difficulty lies in extracting this information from the model, since an LLM's output is a very poor representation of its internal state.
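One common approach to this extraction problem is a "linear probe": train a small classifier on a model's hidden activations to test whether some property is linearly decodable from the internal state. The sketch below uses synthetic activations with a planted signal direction rather than a real LLM, purely to illustrate the idea; all names and dimensions are hypothetical.

```python
import numpy as np

# Toy linear-probe sketch: the "hidden states" are synthetic random
# vectors with a planted concept direction. A real probe would use
# activations captured from an actual model's intermediate layers.
rng = np.random.default_rng(0)
d, n = 64, 500

signal = rng.normal(size=d)            # hypothetical "concept" direction
labels = rng.integers(0, 2, size=n)    # property we try to decode
hidden = rng.normal(size=(n, d)) + np.outer(labels * 2 - 1, signal)

# Logistic-regression probe trained by plain gradient descent.
w = np.zeros(d)
for _ in range(200):
    p = 1 / (1 + np.exp(-hidden @ w))
    w -= 0.1 * hidden.T @ (p - labels) / n

acc = ((hidden @ w > 0) == labels).mean()
print(f"probe accuracy: {acc:.2f}")
```

If the probe scores well above chance, the property is present and linearly readable in the hidden state even when the model's text output never states it, which is exactly the gap between output and internal representation described above.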