▲ nlitened 13 hours ago
> the human brain, of which we don't have a clear understanding of the compute capacity

Neurons have a finite (very low) signal-transfer speed, so just by measuring cognitive reaction time we can put an upper bound on how many _consecutive_ neuron connections are involved in perception, cognitive processing, and the resulting muscle reaction, even for very complex cognitive processes. And that number is only around 100 neurons in sequence. So "the algorithm" could not be _that_ complex in the end (100x matmul+tanh?).

Granted, a lot of parallelism and feedback loops are involved, but overall it gives me (and many others) the impression that when the AGI algorithm is ever found, its "mini" version should be able to run on modest 2025 hardware in real time.
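As a rough back-of-the-envelope sketch of the "100x matmul+tanh" idea: the snippet below times one serial pass through 100 matmul+tanh layers on commodity hardware. The layer width of 1024 and the 1/sqrt(width) weight scaling are arbitrary illustrative choices, not brain measurements.

```python
import time
import numpy as np

WIDTH = 1024  # assumed layer width (illustrative only)
DEPTH = 100   # ~100 serial steps, per the reaction-time bound above

rng = np.random.default_rng(0)
# Random weights, scaled by 1/sqrt(WIDTH) so activations stay O(1)
weights = [
    (rng.standard_normal((WIDTH, WIDTH)) / np.sqrt(WIDTH)).astype(np.float32)
    for _ in range(DEPTH)
]

def forward(x):
    # One strictly serial pass: 100 consecutive matmul + tanh steps.
    for w in weights:
        x = np.tanh(x @ w)
    return x

x = rng.standard_normal(WIDTH).astype(np.float32)
start = time.perf_counter()
y = forward(x)
elapsed = time.perf_counter() - start
print(f"depth={DEPTH}, width={WIDTH}, one serial pass: {elapsed * 1000:.2f} ms")
```

Each step here is a 1024x1024 matvec, i.e. ~1M multiply-adds, so the whole pass is on the order of 10^8 FLOPs, which is well within real-time reach of a 2025 laptop. Whether 100 such steps capture anything like cognition is of course the open question.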
▲ johnb231 12 hours ago | parent
> (100x matmul+tanh?)

Biological neurons are far more complex than that. A single neuron has dendritic trees with subunits doing their own local computations, and there are temporal dynamics in the firing sequences. There is so much more complexity in the biological networks. It's not comparable.
▲ scajanus 12 hours ago | parent
The "granted" is doing a lot of work there. In fact, if you imagine a computer doing tasks comparable to what the human brain can in around 100 serial steps, it becomes clear that accounting for the parallelism is absolutely critical.