Flashtoo 2 days ago

The notion that the brain uses less energy than an incandescent lightbulb and can store less data than YouTube does not mean we have had the compute and data needed to make AGI "for a very long time".

The human brain is not a 20-watt computer ("100 watts per day" is not right) that learns from scratch on 2 petabytes of data. State manipulations performed in the brain can be more efficient than what we do in silicon. More importantly, its internal workings are the result of billions of years of evolution, and continue to change over the course of our lives. The learning a human does over a lifetime is greatly assisted by having a physical body and by the ability to interact with the real world to the extent that the body allows. Even then, we do not learn from scratch: we go through a curriculum that has been refined over millennia, building on knowledge and skills cultivated by our ancestors.

An upper bound on the compute needed to develop AGI that we can take from the human brain is not 20 watts and 2 petabytes of data; it is 4 billion years of evolution in a big and complex environment at molecular-level fidelity. Finding a tighter upper bound is left as an exercise for the reader.

netdevphoenix 2 days ago | parent [-]

> it is 4 billion years of evolution in a big and complex environment at molecular-level fidelity. Finding a tighter upper bound is left as an exercise for the reader.

You have great points there and I agree. The only issue I take is with your remark above. Surely, by your own definition, this is not true: evolution by natural selection is not a deterministic process, so 4 billion years is just one of many possible periods of time needed, and not necessarily the longest or the shortest.

Also, re "The human brain is not a 20-watt computer ("100 watts per day" is not right)", I was merely saying that there exists an intelligence that consumes 20 watts per day, so it is possible to run an intelligence on that much energy per day. This and the compute bit do not refer to the training costs but to the running costs. After all, it would be useless to hit AGI if we do not have enough energy or compute to run it for longer than half a millisecond, or the means to increase the running time.
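
To make the running-cost point concrete, here is a minimal back-of-envelope sketch; the 700 W figure for a modern datacenter GPU is an assumed round number for illustration, not something from this thread:

    # Energy to *run* (not train) an intelligence for one day.
    brain_w = 20    # approximate power draw of a human brain, in watts
    gpu_w = 700     # assumed TDP of one modern datacenter GPU, in watts
    hours = 24
    print(f"brain: {brain_w * hours / 1000:.2f} kWh/day")  # 0.48 kWh/day
    print(f"1 GPU: {gpu_w * hours / 1000:.2f} kWh/day")    # 16.80 kWh/day
    print(f"ratio: {gpu_w / brain_w:.0f}x")                # 35x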

Obviously, the path to design and train AGI is going to take much more than that, just as the human brain did. But given that the path to the emergence of the human brain wasn't the most efficient one, owing to the inherent randomness of evolution by natural selection, there is no need to pretend that all the circumstances around the development of the human brain apply to us: our process isn't random at all, nor is it parallel at a global scale.

Flashtoo 2 days ago | parent | next [-]

> Evolution by natural selection is not a deterministic process so 4 billion years is just one of many possible periods of time needed but not necessarily the longest or the shortest.

That's why I say it is an upper bound - we know that it _has_ happened under those circumstances, so the minimum time needed is not more than that. If we reran the simulation, it could indeed very well be much faster.

I agree that 20 watts can be enough to support intelligence, and if we can figure out how to get there, it will take us much less time than a billion years. I also think that, on the compute side of developing AGI, we should count all the PhD brains churning away at it right now :)

recursive 2 days ago | parent | prev [-]

"watts per day" is just not a sensible metric. watts already has the time component built in. 20 watts is a rate of energy usage over time.