HarHarVeryFunny | 10 hours ago
The name AGI (i.e. generalist AI) was originally intended to contrast with narrow AI, which is only capable of one, or a few, specific narrow skills. A narrow AI might be able to play chess, or distinguish 20 breeds of dog, but wouldn't be able to play tic-tac-toe because it wasn't built for that. AGI would be able to learn to do anything, within reason.

The term AGI is obviously used very loosely, with little agreement on its precise definition, but I think a lot of people take it to mean not only generality, but specifically human-level generality, and human-level ability to learn from experience and solve problems.

A large part of the problem with AGI being poorly defined is that intelligence itself is poorly defined. Even if we choose to define AGI as meaning human-level intelligence, what does THAT mean? I think there is a simple reductionist definition of intelligence (as the word is used to refer to human/animal intelligence), but ultimately the meanings of words are derived from their usage, and the word "intelligence" is used in 100 different ways ...
mrandish | 5 hours ago | parent
> intended to contrast with narrow AI

I've thought for a while that the middle letter in AGI ('General' vs 'Specific') would be more useful and helpful if it were changed to Wide vs Narrow. All AIs can be evaluated on a scale of narrow to wide in terms of their abilities, and I don't think that will change anytime soon.

Everyone understands that something is only wide or narrow in comparison to something else. While that's also true of the terms 'general' and 'specific', those are less used that way in daily conversation these days. In science and tech we make distinctions about generalized vs specific, but 'general' isn't a conversational term like it was 50 or 100 years ago. When I was a kid my grandparents would call the local supermarket the 'general store', which I thought was an unusual usage even then.