da_chicken | 5 days ago
I saw it somewhere else recently, but the idea is that LLMs are language models, not world models. This seems like a perfect example of that. You need a world model to navigate a text game. Otherwise, how can you determine when "North" is a context change and when it isn't?
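A minimal sketch of the kind of state a world model carries (the rooms and exits here are hypothetical, purely for illustration): whether "north" changes your context depends on whether the current room actually has a north exit, which the surface text of the command alone doesn't tell you.

    # Minimal world-model sketch: whether "north" changes context depends
    # on state (the current room's exits), not on the word itself.
    # Rooms and exits are made up for illustration.

    rooms = {
        "bedroom": {"north": "hallway"},   # "north" is a real transition here
        "hallway": {"south": "bedroom"},   # ...but not here
    }

    def go(location: str, command: str) -> str:
        """Return the new location, or the old one if the exit doesn't exist."""
        return rooms[location].get(command, location)

    loc = "bedroom"
    loc = go(loc, "north")   # context change: now in "hallway"
    loc = go(loc, "north")   # no north exit: "north" is NOT a context change
    print(loc)               # -> "hallway"

A pure language model sees the same token "north" both times; only the tracked state distinguishes the two outcomes.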
zahlman | 5 days ago
> I saw it somewhere else recently, but the idea is that LLMs are language models, not world models.

Part of what distinguishes humans from artificial "intelligence", to me, is exactly that we automatically develop models of whatever is needed.
foobarbecue | 5 days ago
On HN, perhaps? #17 on the front page right now: https://news.ycombinator.com/item?id=44854518
manbash | 5 days ago
Thanks for this. I was struggling to put it into words, even if this may already be a well-known distinguishing factor for others.
myhf | 5 days ago
9:05 (Adam Cadre's short interactive fiction game) is a good example of the difference between a language model and a world model: engaging with it on a purely textual level leads to the bad ending (which the researchers have called "100%"), while deliberately getting the good ending requires self-awareness, intentionality, and/or outside context.
lubujackson | 5 days ago
Why, this sounds like Context Engineering!