zppln 6 days ago
> clear path to AGI

What are the steps?
lisper 5 days ago | parent
It's not really about "steps", it's about getting the architecture right. LLMs by themselves are missing two crucial ingredients: embodiment and feedback. The reason they hallucinate is that they have no idea what the words they are saying mean. They are like children mimicking other people. They need to be able to associate the words with some kind of external reality. This could be either the real world, or a virtual world, but they need something that establishes an objective reality.

And then they need to be able to interact with that world, poke at it and see what it does and how it behaves, and get feedback regarding whether their actions were appropriate or not. If I were doing this work, I'd look at a rich virtual environment like Minecraft or SimCity or something like that. But it could also be Coq or a code development environment.
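The agent/world/feedback loop described above can be sketched in a few lines. Everything here is a toy stand-in I made up for illustration (a 1-D `GridWorld`, a reward for reaching a goal cell), not any real Minecraft or Coq integration; the point is just that a policy grounded in observations of the world outperforms one that ignores them.

```python
# Toy sketch of the grounding loop: an agent proposes an action, the
# world applies it, and the agent gets objective feedback. All names
# here (GridWorld, the reward rule) are hypothetical illustrations.
import random

class GridWorld:
    """A 1-D 'world' whose state the agent can poke at."""
    def __init__(self, size=10, goal=7):
        self.size, self.goal, self.pos = size, goal, 0

    def step(self, action):
        # Apply the action, then return (observation, feedback).
        if action == "right":
            self.pos = min(self.pos + 1, self.size - 1)
        elif action == "left":
            self.pos = max(self.pos - 1, 0)
        reward = 1.0 if self.pos == self.goal else 0.0
        return self.pos, reward

def run_episode(policy, world, steps=50):
    total, obs = 0.0, world.pos
    for _ in range(steps):
        action = policy(obs)          # agent acts on the world...
        obs, reward = world.step(action)  # ...and sees what it does
        total += reward
    return total

# A "blind" policy (ignores the world) vs. one grounded in observation.
blind = lambda obs: random.choice(["left", "right"])
grounded = lambda obs: "right" if obs < 7 else "left"

random.seed(0)
print(run_episode(blind, GridWorld()), run_episode(grounded, GridWorld()))
```

A real version of this argument would swap `GridWorld` for a rich environment and the hand-written `grounded` policy for an LLM whose outputs are parsed into actions, with the reward signal closing the feedback loop.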