losvedir 9 hours ago
> Continuous training is the key ingredient. Humans can use existing knowledge and apply it to new scenarios, and so can most AI. But AI cannot permanently remember the result of its actions in the real world, and so its body of knowledge cannot expand.

I think it depends on how you look at it. I don't want to torture the analogy too much, but I see pre-training (getting model weights out of an enormous corpus of text) as more akin to the billions of years of evolution that led to the modern human brain. The brain still has a lot to learn once you're born, but it also already comes with many structures (e.g. for handling visual input and language) and built-in knowledge (instincts), and you can't change those over the course of your life.

I wouldn't be surprised if we ended up with a "pre-train / RAG / context window" architecture for AI, analogous to "evolution / long-term memory / short-term memory" in humans.
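
For concreteness, here is a minimal sketch of how those three layers could fit together. The retrieval step is a toy word-overlap ranking standing in for real embeddings, and every name and document in it is made up for illustration rather than taken from any particular library:

    # "pre-train / RAG / context window" mapped onto
    # "evolution / long-term memory / short-term memory".
    # All names and data here are hypothetical, for illustration only.

    # Long-term memory: an external store that can absorb new facts
    # without touching the frozen ("evolved") model weights.
    long_term_memory = [
        "The 2024-03-01 deployment failed because of a missing env var.",
        "Rolling restarts must drain connections before terminating pods.",
    ]

    def retrieve(query: str, memory: list[str], k: int = 1) -> list[str]:
        """Toy retrieval: rank stored facts by word overlap with the query."""
        q = set(query.lower().split())
        ranked = sorted(memory, key=lambda doc: -len(q & set(doc.lower().split())))
        return ranked[:k]

    def build_prompt(query: str, retrieved: list[str]) -> str:
        """Short-term memory: whatever fits in the context window right now."""
        context = "\n".join(retrieved)
        return f"Context:\n{context}\n\nQuestion: {query}"

    def frozen_model(prompt: str) -> str:
        """Stand-in for the pre-trained model: weights fixed at 'birth'."""
        return f"(answer conditioned on)\n{prompt}"

    if __name__ == "__main__":
        question = "Why did the 2024-03-01 deployment fail?"
        print(frozen_model(build_prompt(question, retrieve(question, long_term_memory))))

The point of the sketch is just the division of labor: new experience lands in the store, retrieval decides what gets promoted into the context window, and the pre-trained weights themselves never change.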