danenania 3 days ago
I’m not sure that self-updating weights is really analogous to “continuous learning” as humans do it. A memory data structure that the model can search efficiently might be a lot closer. Self-updating weights could be more like epigenetics.
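As a minimal sketch of what that kind of searchable memory might look like — assuming a simple embedding store with nearest-neighbour lookup; the Memory class and its methods are illustrative, not any real library's API:

    import numpy as np

    class Memory:
        """Append-only store of (embedding, text) pairs with similarity search."""

        def __init__(self, dim: int):
            self.keys = np.empty((0, dim))   # one embedding vector per memory
            self.values: list[str] = []      # the remembered text itself

        def write(self, key: np.ndarray, value: str) -> None:
            self.keys = np.vstack([self.keys, key])
            self.values.append(value)

        def search(self, query: np.ndarray, k: int = 3) -> list[str]:
            # cosine similarity between the query and every stored key
            sims = self.keys @ query / (
                np.linalg.norm(self.keys, axis=1) * np.linalg.norm(query) + 1e-9
            )
            return [self.values[i] for i in np.argsort(-sims)[:k]]

The point of the analogy: the model's weights stay frozen, and everything new lives in the store, written and read at inference time.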
Jensson 3 days ago
Human neurons are self-updating, though. We aren't running directly on our genes: each cell uses our genes to determine how to connect to other cells, and then learns how to process information based on what it hears from its connected cells. So genes would be a meta-model that updates the weights in the real model, letting it learn to process new kinds of things; for facts, you can use an external memory, just as humans do. Without updating the weights in the model, you will never be able to learn to process genuinely new things, like a new kind of math, since you learn that not by memorizing facts but by building new models for it.
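A loose illustration of that split, assuming plain gradient descent as a stand-in for the fixed "genetic" rule (every name and number below is made up for the example): the meta-rule decides how weights change and never changes itself, while the weights keep updating from experience.

    import numpy as np

    rng = np.random.default_rng(1)

    def meta_rule(w, x, y, lr=0.1):
        """The fixed update rule (the "genes"): a gradient step on squared error."""
        err = w @ x - y
        return w - lr * err * x

    w = rng.normal(size=3)                   # the "real model": mutable weights
    for _ in range(200):                     # a lifetime of experience
        x = rng.normal(size=3)
        y = x @ np.array([1.0, -2.0, 0.5])   # the unknown target to be learned
        w = meta_rule(w, x, y)               # weights self-update; the rule does not
    print(w)                                 # converges toward [1, -2, 0.5]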
HarHarVeryFunny 2 days ago
There's a difference between memory and learning. Would you rather have your illness diagnosed by a doctor, or by a plumber with access to a stack of medical books?

Learning is about assimilating lots of different sources of information, reconciling the differences, trying things out for yourself, learning from your mistakes, being curious about your knowledge gaps and contradictions, and ultimately learning to correctly predict outcomes/actions based on everything you have learnt.

You will soon see the difference in action, since Anthropic apparently agree with you that memory can replace learning, and plan to rely on LLMs with longer compressed context (i.e. memory) in place of the ability to learn. I guess this will be Anthropic's promised 2027 "drop-in replacement remote worker": not an actual plumber, unfortunately (no AGI), but an LLM with a stack of your company's onboarding material. It will have perfect (well, "compressed") recall of everything you've tried to teach it or complained about, but it will have learnt nothing from any of it.
imtringued 2 days ago
In spiking neural networks, the model weights are equivalent to dendrites/synapses, which can form anew and decay during your lifetime.
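A toy sketch of that kind of lifetime plasticity, assuming a Hebbian-style rule with decay, pruning, and random synapse growth — a didactic caricature, not a faithful spiking-neuron model:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 8  # number of neurons

    def plasticity_step(w, pre, post, lr=0.05, decay=0.01, prune=0.005):
        """One step of lifetime plasticity on a synaptic weight matrix."""
        w = w + lr * np.outer(post, pre)   # Hebbian: co-active units wire together
        w = w * (1.0 - decay)              # every strength decays a little each step
        w = np.where(w > prune, w, 0.0)    # weak synapses decay away entirely
        i, j = rng.integers(0, N, size=2)  # and occasionally a new synapse sprouts
        if w[i, j] == 0.0:
            w[i, j] = 2 * prune
        return w

    w = rng.uniform(0.0, 0.1, size=(N, N))          # initial synaptic strengths
    for _ in range(100):                            # a "lifetime" of activity
        pre = (rng.random(N) > 0.5).astype(float)   # which presynaptic units fired
        post = (rng.random(N) > 0.5).astype(float)  # which postsynaptic units fired
        w = plasticity_step(w, pre, post)

Here the weight matrix itself is the thing that grows and decays, which is the sense in which self-updating weights map onto synapses rather than onto an external memory.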