bigyabai 2 hours ago
> Human brain isn't the only way to implement general intelligence - just the one that was the easiest for evolution to put together out of what it had.

The human brain is not a pretrained system. It's objectively more flexible than transformers and capable of self-modulation in ways that no ML architecture can replicate (that I'm aware of).
ACCount37 2 hours ago | parent
Human brain's "pre-training" is evolution cramming way too much structure into it. It "learns from scratch" the way it does because it doesn't actually learn from scratch.

I've seen plenty of wacky test-time training things used in ML nowadays, which is probably the closest to how the human brain learns. None are stable enough to go into the frontier LLMs, where in-context learning still reigns supreme. In-context learning is a "good enough" approximation of continuous learning, it seems.
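To make the distinction concrete, here is a toy sketch of test-time training: a model with frozen "pretrained" weights versus one that takes a few gradient steps on the test batch itself. Everything in it (the linear model, the shifted test distribution, using labels as the adaptation signal) is an illustrative assumption; real test-time training methods typically adapt on a self-supervised objective, since test labels aren't available, and frontier LLMs do none of this.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pre-training": fit a linear model w on data where y = 2x.
X_train = rng.normal(size=(100, 1))
y_train = 2.0 * X_train[:, 0]
w = np.linalg.lstsq(X_train, y_train, rcond=None)[0]  # w is close to [2.0]

# The test distribution shifts: now y = 3x.
X_test = rng.normal(size=(20, 1))
y_test = 3.0 * X_test[:, 0]

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Frozen model: weights stay fixed at inference, as in a standard LLM.
frozen_loss = mse(w, X_test, y_test)

# Test-time training: a few SGD steps on the test batch itself.
# (Labels stand in for the self-supervised signal a real method would use.)
w_ttt = w.copy()
lr = 0.1
for _ in range(50):
    grad = 2 * X_test.T @ (X_test @ w_ttt - y_test) / len(X_test)
    w_ttt -= lr * grad

adapted_loss = mse(w_ttt, X_test, y_test)
print(adapted_loss < frozen_loss)  # adaptation cuts the error on shifted data
```

The instability the comment mentions shows up even here: pick the learning rate too high and the test-time updates diverge instead of adapting, which is one reason frozen weights plus in-context learning remain the safer production choice.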