felipeerias 2 days ago

The discussion about “AGI” is somewhat pointless, because the term is nebulous enough that it will probably end up being defined as whatever comes out of the ongoing huge investment in AI.

Nevertheless, we don’t have a good conceptual framework for thinking about these things, perhaps because we keep trying to apply human concepts to them.

The way I see it, an LLM crystallises a large (but incomplete and disembodied) slice of human culture, as represented by its training set. The fact that an LLM is able to generate human-sounding language makes it all too easy to apply those human concepts to it anyway.

roenxi 2 days ago | parent | next [-]

Not quite pointless - something we have established with the advent of LLMs is that many humans have not attained general intelligence. So we've clarified something that a few people must have been getting wrong; I used to think that the bar was set so that almost all humans met it.

Jensson 2 days ago | parent | next [-]

What do you mean? Almost every human can go to school and become a stable professional at some job. That is the bar to me, and today's LLMs cannot do that.

roenxi a day ago | parent [-]

LLMs are clever enough to hold down a professional job, and they've had far less time learning than the average human. If that is the bar, then AGI has been achieved.

goatlover 2 days ago | parent | prev [-]

Almost all humans do things daily that LLMs don't. The claim only holds if you define general intelligence as proficiency at generating text, rather than as successfully navigating the world while pursuing goals such as friendships, careers, family, politics, and managing one's health.

LLMs aren't Data (Star Trek) or Replicants (Blade Runner). They're not even David or the androids from the movie A.I.

idiotsecant 2 days ago | parent | prev | next [-]

I think it has a practical, easy definition. Can you drop an AI into a terminal, give it the same resources as a human, and reliably get independent work product greater than what that human would produce, across a wide range of domains? If so, it's an AGI.

alternatex 2 days ago | parent [-]

Doesn't sound like AGI without physical capabilities. It's not general if it's bound to digital work.

chipsrafferty 2 days ago | parent | next [-]

I think the intelligence is general if it can do any remote job that only requires digital IO.

It's general intelligence, not general humanity

idiotsecant 2 days ago | parent | prev [-]

Any AGI capable of this wouldn't have much trouble with physical operation of equipment, of all things.

lukebuehler 2 days ago | parent | prev [-]

I agree that the term can muddy the waters, but as a shorthand for roughly “an agent calling an LLM (or several LLMs) in a loop, producing economic output similar to that of a human knowledge worker”, it is useful. And if you pay attention to the AI leaders, that's what the definition has become.
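
For what it's worth, that shorthand maps onto a very small control structure. A minimal sketch in Python, where call_llm, the tool table, and run_agent are hypothetical stand-ins for illustration rather than any particular vendor's API:

    # Minimal sketch of "an agent calling an LLM in a loop".
    # call_llm and the tools below are hypothetical placeholders, not a real API.

    def call_llm(history):
        # A real implementation would send the conversation history to a model
        # and get back either a tool request or a final answer.
        return {"type": "final", "content": "done"}

    TOOLS = {
        "read_file": lambda path: open(path).read(),
    }

    def run_agent(task, max_steps=10):
        history = [{"role": "user", "content": task}]
        for _ in range(max_steps):
            reply = call_llm(history)
            if reply["type"] == "final":
                return reply["content"]  # the model says it is finished
            # Otherwise the model asked for a tool; run it and feed the result back.
            result = TOOLS[reply["tool"]](*reply["args"])
            history.append({"role": "tool", "content": str(result)})
        return None  # gave up after max_steps

    print(run_agent("summarise this repository"))

The economic-output comparison in the definition then happens around this loop: how much useful work comes out relative to the supervision and tooling you have to put in.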