tshaddox 3 days ago

That sounds like you’re describing AGI as being impractical to implement in an electronic computer, not impossible in principle.

epiccoleman 3 days ago | parent [-]

Yeah, I guess I'm not taking a stance on that above, just wondering which link in that chain holds the most explanatory power for intelligence and/or consciousness.

I don't think there's any real reason to think intelligence depends on "meat" as its substrate, so AGI seems in principle possible to me.

Not that my opinion counts for much here, since I don't really have any relevant education on the topic. But my half-baked instinct is that LLMs in and of themselves will never constitute true AGI. The biggest thing that seems to be missing from what we currently call AI is memory - and it's very interesting to see how an LLM's behavior changes if you hook it up to any of the various "memory MCP" implementations out there.
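
For anyone curious, most of those memory servers boil down to something like this - a rough sketch assuming the official MCP Python SDK's FastMCP helper; the remember/recall tool names and the in-process dict are just placeholders, not any particular implementation:

    # Minimal sketch of a "memory" MCP server, assuming the official MCP
    # Python SDK (pip install mcp). Tool names and the in-memory dict are
    # placeholders; real implementations persist to disk or a database.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("memory")
    _store: dict[str, str] = {}  # toy store, lost when the process exits

    @mcp.tool()
    def remember(key: str, value: str) -> str:
        """Save a fact under a key so the model can retrieve it later."""
        _store[key] = value
        return f"stored '{key}'"

    @mcp.tool()
    def recall(key: str) -> str:
        """Look up a previously stored fact by key."""
        return _store.get(key, "nothing stored under that key")

    if __name__ == "__main__":
        mcp.run()  # serves MCP over stdio; point an MCP-capable client at this script

Wire that into an MCP-capable client and the model decides for itself when to call remember/recall, which is where the interesting behavior changes show up.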

Even experimenting with those sorts of things has left me feeling there's still something (or many somethings) missing to take us from what is currently called "AI" to "AGI" or so-called super intelligence.

kelnos 3 days ago | parent | next [-]

> I don't think there's any real reason to think intelligence depends on "meat" as its substrate

This made me think of... ok, so let's say that we discover that intelligence does indeed depend on "meat". Could we then engineer a sort of organic computer that has general intelligence? But could we also claim that this organic computer isn't a computer at all, but is actually a new genetically engineered life form?

mindcrime 3 days ago | parent | prev [-]

> But my half-baked instinct is that LLMs in and of themselves will never constitute true AGI.

I agree. But... LLMs are not the only game in town. They are just one approach to AI currently being pursued - the dominant one by investment dollars, attention, and hype, to be sure, but still far from the only thing around.