da_chicken 5 days ago

I saw it somewhere else recently, but the idea is that LLMs are language models, not world models. This seems like a perfect example of that. You need a world model to navigate a text game.

Otherwise, how can you determine that "North" is sometimes a context change, but not always?
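A minimal sketch of the point: a world model for a text game is essentially state, here a room graph. The rooms and exits below are hypothetical, invented only for illustration; the idea is that the same word ("north") changes the context in one state and does nothing in another, and only the state model, not the text, tells you which.

```python
# Minimal sketch of a text-game "world model": a room graph.
# Rooms and exits are hypothetical, chosen only to illustrate the point.
WORLD = {
    "foyer":   {"north": "hallway"},  # "north" is a context change here...
    "hallway": {"south": "foyer"},    # ...but here there is no north exit.
}

def go(location: str, command: str) -> str:
    """Return the new location, or the same one if that exit doesn't exist."""
    return WORLD[location].get(command, location)

print(go("foyer", "north"))    # moves to "hallway" (context changed)
print(go("hallway", "north"))  # stays in "hallway" (no context change)
```

A pure language model sees the same token "north" in both cases; whether it means anything depends on state the text alone doesn't carry.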

zahlman 5 days ago | parent | next [-]

> I saw it somewhere else recently, but the idea is that LLMs are language models, not world models.

Part of what distinguishes humans from artificial "intelligence" to me is exactly that we automatically develop models of whatever is needed.

mlyle 4 days ago | parent | next [-]

I think it's interesting to think about, and still somewhat uncertain:

* How much is a large language model effectively a world model (after all, language tries to model the world)?

* How much do humans use language in their modeling and reasoning about the world?

* How well suited is language to this task, beyond the extent to which humans already use it for it?

da_chicken 4 days ago | parent | prev [-]

I think that's true to some extent, but I think all animals probably develop a world model.

foobarbecue 5 days ago | parent | prev | next [-]

On HN, perhaps? #17 on the front page right now: https://news.ycombinator.com/item?id=44854518

manbash 5 days ago | parent | prev | next [-]

Thanks for this. I was struggling to put it into words, even though this may already be a well-known distinguishing factor for others.

myhf 5 days ago | parent | prev | next [-]

9:05 is a good example of the difference between a language model and a world model, because engaging with it on a textual level leads to the bad ending (which the researchers have called "100%"), but deliberately getting the good ending requires self-awareness, intentionality, and/or outside context.

lubujackson 5 days ago | parent | prev [-]

Why, this sounds like Context Engineering!