bogzz 3 hours ago

They could help you build an AGI if someone else has already built AGI and published it on GitHub.

unshavedyak 2 hours ago | parent [-]

I see this statement all the time and it's just strange to me. Yes, the LLMs struggle to form unique ideas - but so do we. Most advancements in human history are incremental, built on the shoulders of millions of other incremental advancements.

What I don't understand is how we quantify our ability to actually create something novel, truly and uniquely novel. We're discussing LLMs' inability to do that, yet I don't feel I have a firm grasp on what capacity we ourselves possess.

When pressed, I imagine many folks would immediately claim they can create something never done before - some weird random behavior or noise or drawing or whatever. But much of the time it's just adjacent to existing norms, or constrained by being the inversion of existing norms.

In a lot of cases our incremental novelties feel, to some degree, inevitable. As the foundations of advancement approach the new thing, it starts to look obvious. I suspect this form of novelty is something LLMs are capable of.

So for me the real question is at what point innovation is so far ahead that it doesn't feel like the natural next step. And of course, are LLMs capable of that?

I suspect that for humans this level of true innovation is effectively random. A genius is just more likely to make these "random" connections because they have more data to connect with. But it's random nonetheless, as ideas of this nature often arrive without explanation when they aren't built on the backs of prior art.

So yeah... thoughts?

bogzz an hour ago | parent [-]

I really love Andrej Karpathy's take on LLMs: not intelligence or sentience, but a kind of cortical tissue.

It should be clear from working with LLMs over the past 4 years that they are not conscious.

Andrej's appearance on the Dwarkesh podcast is great.

unshavedyak 13 minutes ago | parent [-]

To be clear, I agree with you; my question is more pointed at us. I'm not sure we have a good understanding of consciousness, nor that we are what we seem, given how prone we are to hallucinations and how subtle hormonal shifts can drastically alter what we perceive as our intelligence, self-identity, etc.

I'm not convinced LLMs are anything amazing in their current form, but I suspect they'll push us toward some self-reflection.

But clearly I think humans are far more input-output than the average person assumes. I'm also not educated on the subject, so what do I know, hah.