mikewarot 4 hours ago

I think that AGI has already happened, but it's not well understood, nor well distributed yet.

OpenClaw, et al., nudged me a little, but it was Sammy Jankis[1,2] that pushed me over the edge, with force. It's janky as all get out, but it'll learn to build its own memory system on top of an LLM, which definitely forgets.

[1] https://sammyjankis.com/

[2] https://news.ycombinator.com/item?id=47018100
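To be concrete about what I mean by a memory system on top of a forgetful LLM, here's a rough sketch of the general pattern (not Sammy Jankis's actual code; the file name, prompt format, and call_llm hook are placeholders I made up): the agent writes notes to disk and pastes them back into every prompt, because the model itself is stateless between calls.

    # Minimal sketch of "memory on top of a forgetful LLM".
    # Placeholder names throughout; not any real project's implementation.
    import json
    from pathlib import Path

    MEMORY_FILE = Path("memory.json")  # hypothetical note store

    def load_memory() -> list[str]:
        if MEMORY_FILE.exists():
            return json.loads(MEMORY_FILE.read_text())
        return []

    def save_memory(notes: list[str]) -> None:
        MEMORY_FILE.write_text(json.dumps(notes, indent=2))

    def build_prompt(user_message: str, notes: list[str]) -> str:
        # The stateless model only "remembers" what we paste back in.
        recalled = "\n".join(f"- {n}" for n in notes) or "(no notes yet)"
        return (
            "Notes from previous sessions:\n"
            f"{recalled}\n\n"
            f"User: {user_message}\n"
            "Reply, then on a final line starting with NOTE: write one fact "
            "worth remembering next time."
        )

    def run_turn(user_message: str, call_llm) -> str:
        # call_llm: any function that takes a prompt string and returns text.
        notes = load_memory()
        reply = call_llm(build_prompt(user_message, notes))
        # Persist anything the model flagged as worth keeping.
        for line in reply.splitlines():
            if line.startswith("NOTE:"):
                notes.append(line.removeprefix("NOTE:").strip())
        save_memory(notes)
        return reply

The whole trick is that last instruction in the prompt: the model decides what to remember, and the wrapper does the actual remembering.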

dimitri-vs an hour ago

I really don't see why AGI can't be a spectrum. Maybe what we have now is very weak AGI, and going from weak to strong will take many years, if it ever happens.

hermitShell 3 hours ago

The Sammy Jankis link was certainly interesting. Thanks for sharing.

Whether or not AGI is imminent, and whether or not Sammy Jankis is or ever will be conscious... it's going to get close enough that, for most people, there will be no difference except to philosophers.

Is AGI 'right around the corner', or already here? I agree with the author: no. We have something like 10 years to go, IMO. At the end of the post he points to the last 30 years of research, and I would accept that as an upper bound. In 10 to 30 years, 99% of people won't be able to distinguish between an 'AGI' and another person when not in meatspace.