▲ zby 5 hours ago
My bet is that the solution to continuous learning is external storage. There is a lot of talk about context engineering, but I have not seen anyone treat context as the main bottleneck and build a system around that. It would show that even "context engineering" is somewhat the wrong term: context does not enter the LLM in some mysterious way. It goes through the prompt, and the whole model of passing chat history back and forth is not the most efficient use of a limited prompt.
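The idea above can be sketched as a loop where the prompt is treated as the scarce resource: instead of replaying the full chat history, the system pulls only the most relevant notes from a persistent store into the prompt. This is a minimal illustration, not any real framework; all names (`ExternalMemory`, `build_prompt`) are made up, and the word-overlap scoring stands in for what would be embeddings or a search index in practice.

```python
class ExternalMemory:
    """Persistent note store that outlives any single chat session."""

    def __init__(self):
        self.notes = []

    def write(self, text):
        self.notes.append(text)

    def retrieve(self, query, k=2):
        # Naive relevance: count overlapping words. A real system would
        # use embeddings or a proper search index instead.
        qwords = set(query.lower().split())
        scored = [(len(qwords & set(n.lower().split())), n) for n in self.notes]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [note for score, note in scored if score > 0][:k]


def build_prompt(memory, user_message, budget_chars=500):
    # Only the most relevant notes enter the prompt, not the whole history,
    # so the fixed prompt budget is spent on what matters for this turn.
    context, used = [], 0
    for note in memory.retrieve(user_message):
        if used + len(note) > budget_chars:
            break
        context.append(note)
        used += len(note)
    return "\n".join(context) + "\nUser: " + user_message


mem = ExternalMemory()
mem.write("The user's project is a Rust CLI for parsing logs.")
mem.write("The user prefers answers with code examples.")
mem.write("Unrelated note about cooking recipes.")
prompt = build_prompt(mem, "How do I parse logs in my Rust project?")
```

Here only the note mentioning the Rust project makes it into the prompt; the unrelated note is filtered out rather than carried along as history.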
▲ mhl47 4 hours ago
"External storage," whatever that is, cannot be the same as continuous learning, since it does not form the strong connections or capture the interdependencies of knowledge. That said, I think we will also see more effort on the business side toward models that help you build a knowledge base in some standardized way the model is trained to read, or that synthesize some sort of instructions for navigating your knowledge base. Currently, e.g., Copilot tries to navigate a hot mess of an MS knowledge graph that is very different for each company. And due to its amnesia it has to repeat the discovery in every session. No wonder that does not work. We have to either standardize, or store somewhere (in the model, or in instructions) how to find information efficiently.
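The "store somewhere how to find information" idea amounts to persisting the outcome of discovery so it is not repeated every session. A hypothetical sketch, assuming a simple JSON manifest on disk; the file name, field names, and entries are all invented for illustration, not any existing standard:

```python
import json

MANIFEST_PATH = "kb_manifest.json"  # illustrative location, not a real convention


def save_manifest(entries, path=MANIFEST_PATH):
    # Written once, after discovery: where each kind of information lives.
    with open(path, "w") as f:
        json.dump(entries, f, indent=2)


def load_manifest(path=MANIFEST_PATH):
    # Later sessions load this instead of re-discovering the sources.
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}  # first session: discovery still required


save_manifest({
    "vacation_policy": {"source": "HR wiki", "path": "/hr/policies/leave"},
    "deploy_runbook": {"source": "ops repo", "path": "docs/runbooks/deploy.md"},
})
manifest = load_manifest()
```

Standardizing a format like this is exactly the kind of thing a model could be trained to read directly, avoiding the per-company discovery the comment describes.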
▲ Centigonal 5 hours ago
What do you mean when you say "external storage"?