lukeschlather | 5 days ago
This is a really good overview, and it has held up remarkably well after several decades; at least in terms of the facts and predictions, everything has happened as the author says. I do want to pick at some of the numbers in the upper bound, though, because we're getting close to the end of the first third of the century and we don't have ASI yet, even though we have roughly hit the upper bound the author defines.

> Since a signal is transmitted along a synapse, on average, with a frequency of about 100 Hz and since its memory capacity is probably less than 100 bytes (1 byte looks like a more reasonable estimate)

I admit my feeling is that neurons/synapses probably have less than 100 bytes of memory, and also that a byte or less is more plausible, but I would like to see some more rigorous proof that they can't possibly have more than a gigabyte of memory that the synapse/neuron can access at the speed of computation. The author has a note where they handwave away the possibility that chemical processes could meaningfully increase the operations per second, and I'm comfortable with that, but this point:

> Perhaps a more serious point is that neurons often have rather complex time-integration properties

seems more interesting, especially in the context of whether there's dramatically more storage available in neurons/synapses: say a neuron can do a few operations per minute over 1 GB of data per synapse (which sounds absurdly high, but just for the sake of argument). And since we're clearly past the 100 TOPS, putting some absurdly generous upper bounds in might be helpful: asking, for example, how many H100s you would need under absurd suppositions about the capacity of human synapses and neurons. It seems like we probably have enough.
But you could also make a case that only some of the largest supercomputing clusters can actually match the upper bound for the capacity of a single human brain. Then again, someone might be able to convince me that a manageable cluster of H100s already meets the most generous possible upper bound.
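To make the "absurdly generous" question concrete, here is a back-of-envelope sketch. Every number in it is an assumption for the sake of argument (synapse count, firing rate, a large fudge factor for time-integration/chemistry, and an order-of-magnitude figure for H100 throughput), not a measurement:

```python
# Back-of-envelope: how many H100s to match a generously estimated brain?
# All figures are assumptions for the sake of argument, not measurements.

SYNAPSES = 2e14      # generous count of human synapses
FIRING_HZ = 100      # per-synapse signal frequency, the paper's figure
FUDGE = 1_000        # absurd multiplier for time-integration / chemistry

brain_ops = SYNAPSES * FIRING_HZ * FUDGE  # "ops" per second, very loosely

H100_OPS = 2e15      # ~2,000 TFLOPS low-precision, order of magnitude only

h100s_needed = brain_ops / H100_OPS
print(f"brain estimate: {brain_ops:.0e} ops/s -> ~{h100s_needed:,.0f} H100s")
```

Even with the thousand-fold fudge factor, the answer comes out around ten thousand H100s, i.e. a large but existing-scale cluster, which is the point above.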
kelseyfrog | 5 days ago
A 5090 has a peak theoretical GenAI limit of 3,356 TOPS, so we're "already" an order of magnitude beyond what was considered enough for AGI. One question is: what happened here? Was the original estimate wrong? Have we not found the "right" algorithm yet? Something else?
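The "order of magnitude" claim is easy to check directly; the only inputs are the paper's rough ~100 TOPS brain figure and the 3,356 TOPS number quoted above (a peak theoretical, sparse low-precision figure, so not directly comparable to biological "ops"):

```python
# Comparing the paper's ~100 TOPS brain estimate to one modern GPU.
BRAIN_ESTIMATE_TOPS = 100   # the paper's rough figure for a human brain
RTX_5090_TOPS = 3356        # peak theoretical GenAI TOPS, as quoted

ratio = RTX_5090_TOPS / BRAIN_ESTIMATE_TOPS
print(f"one 5090 = ~{ratio:.0f}x the paper's brain estimate")
```

So a single consumer card nominally exceeds the estimate by about 34x, which is what makes the "what happened here?" question bite.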
RaftPeople | 5 days ago
> I admit my feeling is that neurons/synapses probably have less than 100 bytes of memory, and also that a byte or less is more plausible, but I would like to see some more rigorous proof that they can't possibly have more than a gigabyte of memory that the synapse/neuron can access at the speed of computation.

Based on lots of reading about brain research and the relentless flow of new and unknown things that need further study, my personal gut feeling is that the estimates in that paper about the brain's computational ability don't really have a valid foundation. Too many things have been discovered since then, and too many things are still not understood. Some interesting items:

1. Astrocytes are computational cells which need to be included in the math. They have internal calcium waves localized in their processes, as well as across the entire cell and between cells.

2. Recent research showed that neuron signal timing down to the millisecond level carries information.

3. Individual cells (neurons and non-neurons) learn; they don't require a synapse and an external cell for that capability.

4. Neurons are influenced by the electromagnetic field around them, and somehow that influence would need to be included in a calculation of information flow.
AIPedant | 4 days ago
I think we are severely underestimating the computational complexity of animal brains by looking at short-term reactions and snap judgements, not deep thinking or long-term learning. Axons transmit electrical signals, and that's what Bostrom is taking to be an "op." But they also transmit vesicles of mRNA and proteins directly from the cytoplasm of one neuron into another, which is an "op" of unimaginable complexity compared to a neuron simply firing (or any CPU instruction), and we have no clue what that means for cognition.
tim333 | 5 days ago
Re the capabilities of neurons, the argument in Moravec's paper seems quite solid: it compares a part of the brain we understand quite well, the retina, to computer programs performing the same function. My feeling is that we already have enough compute for ASI, just not brain-like algorithms. I'm not sure whether that will get solved by smart humans analysing the brain or by something like AlphaEvolve (https://news.ycombinator.com/item?id=43985489). One advantage of computers being much quicker than needed is that you can run lots of experiments. The power requirements alone make me think current algorithms are pretty inefficient compared to the brain.
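For anyone unfamiliar with the retina argument, its shape is simple scaling arithmetic. The numbers below are rough round figures of my own choosing to illustrate the structure of the reasoning, not Moravec's exact values:

```python
# Sketch of a Moravec-style retina scaling argument.
# Both constants are illustrative assumptions, not Moravec's exact figures.

RETINA_OPS = 1e9        # assumed: computer ops/s to replicate retinal processing
BRAIN_TO_RETINA = 1e5   # assumed: whole brain is ~100,000x the retina's neural tissue

brain_ops = RETINA_OPS * BRAIN_TO_RETINA
print(f"scaled whole-brain estimate: {brain_ops:.0e} ops/s")
```

With these round numbers the scaled estimate lands at 1e14 ops/s (100 TOPS), the same ballpark as the figure debated upthread, which is why the retina comparison carries so much weight.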