parineum 5 days ago

> LLMs are pretty clearly AI and AGI under commonly understood, lay definitions.

That's certainly not clear. For starters, I don't think there is a lay definition of AGI, which is largely my point.

The only reason people are willing to call LLMs AI is because that's how they are being sold and the shine isn't yet off the rose.

How many people call Siri AI? It used to be called that, but people have had time to feel around the edges where it fails to meet their expectations of AI.

You can tell what people think of AI by the kind of clickbait surrounding LLMs. Not long ago I read an article with a headline about an LLM lying to avoid being turned off. It turns out it was intentionally prompted to do that, but the point is that that kind of self-preservation is what people expect of AI. Implicitly, they expect that AI has a "self".

ChatGPT doesn't have a self.

shkkmo 5 days ago | parent [-]

AI and AGI are broad umbrella terms. Stuff like Alpha Zero is AI but not AGI while LLMs are both.

Engaging in semantic battles to try to change the meanings of those terms is just going to create more confusion, not less. Instead, why not use more specific and descriptive labels to be clear about what you are saying?

Self-Aware AGI, Human-Level AGI, and Super-Human ANI are all much more useful than trying to force a general label to be used in a specific way.

parineum 5 days ago | parent [-]

> Engaging in semantic battles to try to change the meanings of those terms is just going to create more confusion

You're doing that. I've never seen someone state, as fact, that LLMs are AGI before now. Go ask someone on the street what Super-Human ANI means.

shkkmo 5 days ago | parent [-]

> I've never seen someone state, as fact, that LLMs are AGI before now.

Then you probably haven't been paying attention.

https://deepmind.google/research/publications/66938/

Many LLMs are AI that weren't designed or trained to solve a narrow problem scope. They can complete a wide range of tasks with varying levels of proficiency. That makes them artificial general intelligence, or AGI.

You are confused because lots of people use "AGI" as a shorthand to talk about "human level" AGI that isn't limited to a narrow problem scope.

It's not wrong to use the term this way, but it is ambiguous and vague.

Even the term "human level" is poorly defined, and if I wanted to use the term "Human-Level AGI" in any discussion of what qualifies, I'd need to specify how I was defining it.

parineum 5 days ago | parent [-]

I'm not confused at all. Your own personal definitions just further my point that tech people have a much different classification system than the general populace, and that the reason those extra classifications are needed is that ambitious CEOs keep using the term incorrectly in order to increase share prices.

It's actually very funny to me that you are stating these definitions so authoritatively despite the terms not having any sort of rigor attached to either their definition or usage.

shkkmo 3 days ago | parent [-]

> It's actually very funny to me that you are stating these definitions so authoritatively despite the terms not having any sort of rigor attached to either their definition or usage.

Huh? My entire point was that AI and AGI are loose, vague terms, and if you want to be clear about what you are talking about, you should use more specific terms.