parineum | 5 days ago
> LLMs are pretty clearly AI and AGI under commonly understood, lay definitions.

That's certainly not clear. For starters, I don't think there is a lay definition of AGI, which is largely my point. The only reason people are willing to call LLMs AI is that that's how they are being sold, and the shine isn't yet off the rose. How many people call Siri AI? It used to be called that, but people have had time to feel around the edges where it fails to meet their expectations of AI.

You can tell what people think of AI by the kind of clickbait surrounding LLMs. I read an article not too long ago with a headline about an LLM lying to avoid being turned off. It turns out it was intentionally prompted to do that, but the point is that that kind of self-preservation is what people expect of AI. Implicitly, they expect that AI has a "self". ChatGPT doesn't have a self.
shkkmo | 5 days ago | parent
AI and AGI are broad umbrella terms. Something like AlphaZero is AI but not AGI, while LLMs are both. Engaging in semantic battles to try to change the meanings of those terms is just going to create more confusion, not less. Instead, why not use more specific and descriptive labels to be clear about what you are saying? "Self-aware AGI", "human-level AGI", and "superhuman ANI" are all much more useful than trying to force a general label to be used in a specific way.