mdp2021 | 2 days ago
Same thing: we create models of how to solve the problem, not biomimicry models of how natural entities solve it. Those are not necessary; they sit on a lower layer of the stack.
egg1 | 2 days ago | parent
Except that doesn't make sense if you can't articulate what the problem space is. We know arithmetic inside and out, and we understood how to build mechanical calculators centuries before the dawn of electronics. "Intelligence," on the other hand, is a nebulously defined philosophical concept. What I see in these attempts at AGI is VC-funded circuses throwing shit at the wall, hardly checking to see if it sticks, and then heaping more on top. Nobody can explain how exactly transformer models are the building blocks of intelligence, or how building on top of them will lead to real intelligence.