altruios 2 days ago

>If we go with this definition instead

...Let's not go with the nonsense definitions then.

I agree, systems don't need a brain to be intelligent, and, relatedly, I don't think systems need to be conscious to be 'intelligent'.

You are excluding this system (LLM + harness) that learns (separately), can modify its surrounding environment via a shell interface (including setting up a nightly training loop to reweight itself based on its daily actions and interactions), from being intelligent. Do I have that right? Or are you thinking 'only' in terms of the LLM?

gslepak 2 days ago | parent [-]

I do call openclaw-style agents "living agents", although they might be closer to a kind of zombie. Living agents like openclaw et al. do have a self-modifying property of sorts thanks to their memory, so that system might be more AGI-ish. But it still has a fundamental cap on its potential, which remains frozen at the LLM.

> (including setting up a nightly training loop to reweight itself based on its daily actions and interactions) from being intelligent

I'd have a harder time arguing that sort of system isn't AGI.