saberience 2 days ago

I hate when people bring up this “billions of years of evolution” idea. It’s completely wrong and deluded in my opinion.

Firstly, humans have not been evolving for "billions" of years.

Homo sapiens have been around for maybe 300,000 years, and the Homo genus for 2-3 million years. Before that our ancestors were chimp-like apes, and the split from chimpanzees happened roughly 6-7 million years ago.

If you want to look at the entire arc of brain development, i.e. from mouse-like creatures through apes to humans, that's roughly 200 million years.

If you want to think in terms of generations, that's only 50-75 million generations, i.e. "training loops".
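
As a rough sanity check on that figure (the average generation length across the whole lineage is an assumption here, somewhere between a mouse's months and a human's decades):

    # Rough sanity check of the 50-75 million generations figure.
    # The average generation length is an assumed, illustrative value:
    # months for early mammals, decades for humans.
    YEARS_OF_BRAIN_EVOLUTION = 200e6
    for avg_generation_years in (2.7, 4.0):
        generations = YEARS_OF_BRAIN_EVOLUTION / avg_generation_years
        print(f"{generations / 1e6:.0f} million generations")
    # prints roughly 74 and 50 million generations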

That’s really not very many.

Also, the bigger point is this: for 99.9999% of that time we had no writing, and no need for complex abstract thinking.

So our ability to reason about maths, writing, science, etc. has only developed in the last 2,000-2,500 years! I.e. only roughly 100 or so generations.

Our brain was not “evolved” to do science, maths etc.

Most of evolution was us running around just killing stuff and eating and having sex. It’s only a tiny tiny amount of time that we’ve been working on maths, science, literature, philosophy.

So actually, these models get a massive, massive amount more training than humans ever had in order to do roughly the same thing, and they burn insane amounts of computing power and energy doing it.

Our brains evolved for a completely different world, environment, and daily life than the life we lead now.

So yes, LLMs are good, but they have been exposed to more data and training time than any human could absorb unless we lived for 100,000 years, and they still perform worse than we do on most problems!

hodgehog11 2 days ago | parent | next [-]

Okay, fine, let's remove the evolution part. We still spend an incredible amount of our lifetime visualising the world and drawing conclusions about the patterns within it. Our analogies are often physical and we draw insights from that. To say that humans only draw their information from textbooks is foolhardy; at the very least, you have to agree there is much more.

I realise upon reading the OP's comment again that they may have been referring to "extrapolation", which is hugely problematic from the statistical viewpoint when you actually try to break things down.

My argument for compression asserts that LLMs see a lot of knowledge, but are actually quite small themselves. To output a vast amount of information in such a small space requires a large amount of pattern matching and underlying learned algorithms. I was arguing that humans are actually incredible compressors because we have many years of history in our composition. It's a moot point though, because it is the ratio of output to capacity that matters.
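
As a rough illustration of that output-to-capacity ratio (every number below is an assumption picked for scale, not a figure from any particular model):

    # Back-of-envelope for the compression framing. All numbers are
    # illustrative assumptions, not disclosed figures for any real model.
    TRAINING_TOKENS = 15e12      # assumed corpus: ~15 trillion tokens
    BYTES_PER_TOKEN = 4          # rough average for English text
    PARAMS = 70e9                # assumed model size: 70B parameters
    BYTES_PER_PARAM = 2          # 16-bit weights

    corpus_bytes = TRAINING_TOKENS * BYTES_PER_TOKEN
    model_bytes = PARAMS * BYTES_PER_PARAM
    print(f"corpus: {corpus_bytes / 1e12:.0f} TB")        # ~60 TB of text
    print(f"model:  {model_bytes / 1e12:.2f} TB")         # ~0.14 TB of weights
    print(f"ratio:  {corpus_bytes / model_bytes:.0f}:1")  # roughly 430:1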

vrighter 10 hours ago | parent [-]

They can't learn iterative algorithms if they cannot execute loops. And blurting out an output which we then feed back in does not count as a loop. That's a separate invocation with fresh inputs, as far as the system is concerned.

They can attempt to mimic the results for small instances of the problem, where there are a lot of worked examples in the dataset, but they will never be able to generalize and actually give the correct output for arbitrarily sized instances of the problem. Not with current architectures. Some algorithms simply can't be expressed as a fixed-size matrix multiplication.
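
A toy sketch of the structural point (placeholder code, not any real model's architecture): a fixed-depth forward pass does the same amount of work per call regardless of the input, whereas a genuinely iterative algorithm loops a data-dependent number of times.

    # Toy contrast, not any real model's code.
    def fixed_depth_forward(x, num_layers=4):
        # Stands in for a transformer: exactly num_layers steps,
        # fixed when the architecture is chosen, regardless of x.
        for _ in range(num_layers):
            x = [v * 0.5 + 1.0 for v in x]   # placeholder "layer"
        return x

    def gcd_with_steps(a, b):
        # A genuinely iterative algorithm: the loop runs as many
        # times as these particular inputs require.
        steps = 0
        while b:
            a, b = b, a % b
            steps += 1
        return a, steps

    print(fixed_depth_forward([1.0, 2.0]))   # always 4 "layers" of work
    print(gcd_with_steps(1071, 462))         # (21, 3)
    print(gcd_with_steps(832040, 514229))    # (1, 28): step count grows with the input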

GoblinSlayer 2 days ago | parent | prev | next [-]

>Most of evolution was us running around just killing stuff and eating and having sex.

Tell Boston Dynamics how to do that.

Mice inherited their brains from their ancestors. You might think you don't need a working brain to reason about math, but that's because you don't know how thinking works; it's an argument from ignorance.

saberience 2 days ago | parent [-]

You've missed the point entirely.

People argue that humans have had the equivalent of training a frontier LLM for billions of years.

But training a frontier LLM involves taking multiple petabytes of data, effectively all of recorded human knowledge and experience: every book ever written, every scientific publication ever written, all of known maths and science, encyclopedias, podcasts, etc. And then training on that for millions of years' worth of GPU-core time.
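
To put rough numbers behind "millions of years of GPU-core time" (cluster size and duration are assumptions; the core count is the published figure for an NVIDIA H100 SXM):

    # Back-of-envelope only; cluster size and duration are assumptions,
    # not figures from any disclosed training run.
    NUM_GPUS = 20_000          # assumed cluster size
    TRAINING_DAYS = 100        # assumed wall-clock duration
    CORES_PER_GPU = 16_896     # CUDA cores in an NVIDIA H100 SXM

    gpu_years = NUM_GPUS * TRAINING_DAYS / 365
    core_years = gpu_years * CORES_PER_GPU
    print(f"GPU-years:      {gpu_years:,.0f}")                # ~5,500
    print(f"GPU-core-years: {core_years / 1e6:.0f} million")  # ~93 million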

You cannot possibly equate human evolution with LLM training, it's ridiculous.

Our "training" time didn't involve any books, maths, science, reading, 99.9999% of our time was just in the physical world. So you can quite rationally argue that our brains ability to learn without training is radically better and more efficient that the training we do for LLMs.

Us running around in the jungle wasn't training our brain to write poetry or compose music.

dwaltrip 2 days ago | parent [-]

> Us running around in the jungle wasn't training our brain to write poetry or compose music.

This is a crux of your argument; you need to justify it. It sounds way off base to me. Kinda reads like an argument from incredulity.

KalMann 2 days ago | parent | next [-]

No, I think what he said was true. Human brains have something about them that allow for the invention of poetry or music. It wasn't something learned through prior experience and observation because there aren't any poems in the wild. You might argue there's something akin to music, but human music goes far beyond anything in nature.

hodgehog11 2 days ago | parent [-]

We have an intrinsic (and strange) reward system for creating new things, and it's totally awesome. LLMs only started to become somewhat useful once researchers tried to tap into that innate reward system and create proxies for it. We definitely have not succeeded in creating a perfect mimicry of that system though, as any alignment researcher would no doubt tell you.

saberience 2 days ago | parent | prev [-]

So you're arguing that "running around in the jungle" is equivalent to feeding the entirety of human knowledge in LLM training?

Are you suggesting that somehow there were books in the jungle, or perhaps boardgames? Perhaps there was a computer lab in the jungle?

Were apes learning to conjugate verbs while munching on bananas?

I don't think I'm suggesting anything crazy here... I think people who say LLM training is equivalent to "billions of years of evolution" need to justify that argument far more than I need to justify that running around in the jungle is not equivalent to mass-processing petabytes of highly rich, complex, dense, and VARIED information.

One year of running around in the same patch of jungle, eating the same fruit, killing the same insects, and having sex with the same old group of monkeys isn't going to be equal to training with the super varied, complete, entirety of human knowledge, is it?

If you somehow think it is though, I'd love to hear your reasoning.

hodgehog11 2 days ago | parent | next [-]

There is no equivalency, only contributing factors. One cannot deny that our evolutionary history has contributed to our current capacity, probably in ways that are difficult to perceive unless you're an anthropologist.

Language is one mode of expression, and humans have many. This is another factor that makes humans so effective. To be honest, I would say that physical observation is far more powerful than all the bodies of text, because it is comprehensive and can respond to interaction. But that is merely my opinion.

No-one should be arguing that an LLM training corpus is the same as evolution. But information comes in many forms.

chipsrafferty 2 days ago | parent | prev [-]

You're comparing the hyper-specific evolution of one individual (an AI system) to the more general evolution of the entire human species (billions of individuals). It's as if you're forgetting how evolution actually works - natural selection - and forgetting that when you have hundreds of billions of individuals over thousands of years, even small insights gained from "running around in the jungle" can compound in ways that are hard to conceptualize.

I'm saying that LLM training is not equivalent to billions of years of evolution because LLMs aren't trained using evolutionary algorithms; there will always be fundamental differences. However, it seems reasonable to think that the effect of that "training" might be more or less around the same level.

Ajakks 2 days ago | parent | prev | next [-]

I'm so confused as to how you think you can cut an endless chain at the mouse.

Were mammals the first thing? No. Earth was a ball of ice for a billion years - all life at that point existed solely around thermal vents at the bottom of the oceans... that's inside of you, too.

Evolution doesn't forget. Everything that all life has ever been "taught" (violently programmed into us over incredible timescales), all that has ever been learned in the chain of DNA from the single cell to human beings - it's ALL still there.
