▲ netdevphoenix | 2 days ago
> The current generation of LLM's have convinced me that we already have the compute and the data needed for AGI, we just likely need a new architecture

This is likely true, but not for the reasons you think. It was arguably true 10 years ago too. A human brain uses 100 watts per day approx and, unlike most models out there, the brain is ALWAYS in training mode. It has about 2 petabytes of storage. In terms of raw capabilities, we have been there for a very long time.

The real challenge is finding the point where we can build something AGI-level with the stuff we have. Right now, we might have the compute and data needed for AGI but lack the tools needed to build a system that efficient.

It's like a little dog trying to get into a fenced house: the shortest path between the dog and the house may not be accessible to that dog, because of its current capabilities (short legs, inability to jump high or push through the fence standing in between), so a longer path may in practice be the quickest way to reach the house. In case it's not obvious: AGI is the house, we are the little dog, and the fence represents the current challenges in building AGI.
▲ Flashtoo | 2 days ago | parent
The notion that the brain uses less energy than an incandescent lightbulb and can store less data than YouTube does not mean we have had the compute and data needed to make AGI "for a very long time".

The human brain is not a 20-watt computer ("100 watts per day" is not right) that learns from scratch on 2 petabytes of data. State manipulations performed in the brain can be more efficient than what we do in silicon. More importantly, its internal workings are the result of billions of years of evolution, and continue to change over the course of our lives.

The learning a human does over its lifetime is assisted greatly by the reality of the physical body and the ability to interact with the real world to the extent that our body allows. Even then, we do not learn from scratch. We go through a curriculum that has been refined over millennia, building on knowledge and skills that were cultivated by our ancestors.

An upper bound of compute needed to develop AGI that we can take from the human brain is not 20 watts and 2 petabytes of data, it is 4 billion years of evolution in a big and complex environment at molecular-level fidelity. Finding a tighter upper bound is left as an exercise for the reader.
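To make the units concrete, here is a back-of-envelope sketch in Python. Everything in it is an illustrative assumption, not a measurement: the commonly cited ~20 W metabolic figure for the brain, an 80-year lifespan, and a made-up GPU cluster (1000 GPUs at 400 W for 30 days) chosen only to show how the two energy budgets compare on the same scale.

```python
# Back-of-envelope comparison: a brain's lifetime energy budget vs. a
# hypothetical GPU training run. All inputs are assumptions for illustration.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def lifetime_energy_kwh(power_watts: float, years: float) -> float:
    """Total energy in kWh for a device drawing power_watts for the given years."""
    joules = power_watts * years * SECONDS_PER_YEAR
    return joules / 3.6e6  # 1 kWh = 3.6 MJ

# Commonly cited ~20 W metabolic power for the brain, over an 80-year life.
brain_kwh = lifetime_energy_kwh(20, 80)

# Hypothetical cluster: 1000 GPUs at 400 W each, running for 30 days.
cluster_kwh = lifetime_energy_kwh(1000 * 400, 30 / 365.25)

print(f"brain, whole lifetime: {brain_kwh:,.0f} kWh")   # on the order of 14,000 kWh
print(f"cluster, 30 days:      {cluster_kwh:,.0f} kWh")  # 288,000 kWh
```

Even granting the 20 W figure, the point above stands: running at 20 W is the cost of *inference plus lifelong fine-tuning* on pre-evolved hardware, so the brain's wattage is not an upper bound on the compute needed to *arrive at* that architecture.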