shkkmo 6 days ago
AI has had many, many lay meanings over the years. Simplistic decision trees and heuristics for video games are called AI. It is a loose term, and trying to apply it with semantic rigour is useless, as is trying to tell people that it should only be used to match one of its many meanings.

If you want some semantic rigour, use more specific terms like AGI, human-equivalent AGI, superhuman AGI, exponentially self-improving AGI, etc. Even those labels lack rigour, but at least they are less ambiguous.

LLMs are pretty clearly AI and AGI under commonly understood, lay definitions. LLMs are not human-level AGI and perhaps never will be by themselves.
parineum 5 days ago
> LLMs are pretty clearly AI and AGI under commonly understood, lay definitions.

That's certainly not clear. For starters, I don't think there is a lay definition of AGI, which is largely my point. The only reason people are willing to call LLMs AI is that that's how they are being sold, and the shine isn't off the rose yet.

How many people call Siri AI? It used to be called that, but people have had time to feel around the edges where it fails to meet their expectations of AI.

You can tell what people think of AI by the kind of clickbait surrounding LLMs. I read an article not too long ago with a headline about an LLM lying to avoid being turned off. It turns out it was intentionally prompted to do that, but the point is that that kind of self-preservation is what people expect of AI. Implicitly, they expect that AI has a "self". ChatGPT doesn't have a self.