izzydata 4 hours ago
If we take that statement as fact, then I don't believe we are even close to an LLM being sufficiently complex. However, I don't think the statement is even true. LLMs may not be on the right track to achieving AGI at all, and without starting from scratch down an alternate path, it may never happen. LLMs seem to me like a complicated database lookup. Storage and retrieval of information is just a single piece of intelligence. There must be more to intelligence than a statistical model of the probable next piece of data. Where is the self-learning without intervention by a human? Where is the output that wasn't asked for? At any rate, no amount of hype is going to get me to believe AGI is going to happen soon. I'll believe it when I see it.
hackinthebochs 2 hours ago | parent
> I'll believe it when I see it.

And how will you know AGI when you see it?