abraxas 6 days ago

> it's pretty clear at this point that LLMs are never going to lead to general intelligence.

It is far from clear. There may well be emergent hierarchies of more abstract thought at much higher weight counts. We simply don't know how a transformer will behave if one is built with 100T connections - something that would finally approach the connectome scale of a human brain. Perhaps nothing interesting happens, but we do not know, and the current limitation on building such a beast is likely hardware, not software.

At these scales, using silicon transistors to approximate analog switching behavior just doesn't make sense. True neuromorphic chips may be needed to reach the number of weights necessary for general intelligence to emerge. I don't think anything in production today rivals the efficiency of biological neurons. Most likely we don't need that level of efficiency, but it's almost certain that stringing together a bunch of H100s isn't a path to the scale we should be aiming for.
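A rough back-of-envelope sketch of the H100 point (the bytes-per-weight and memory figures below are illustrative assumptions, not a serious estimate):

```python
# Back-of-envelope: memory needed just to *store* a 100T-parameter model,
# versus the on-board memory of a single H100. Illustrative assumptions only.

params = 100e12            # 100 trillion weights (rough human-connectome scale)
bytes_per_param = 2        # assuming fp16/bf16 storage
h100_memory_bytes = 80e9   # 80 GB HBM on an H100 (SXM variant)

total_bytes = params * bytes_per_param
gpus_for_weights_alone = total_bytes / h100_memory_bytes

print(f"Weights alone: {total_bytes / 1e12:.0f} TB")          # 200 TB
print(f"H100s just to hold them: {gpus_for_weights_alone:.0f}")  # 2500
```

And that's before optimizer states, activations, or gradients during training, which multiply the footprint several times over - which is the point: brute-forcing to connectome scale on today's GPUs is a very different proposition from building hardware matched to the workload.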