Vegenoid 7 days ago
> If you want to be pedantic about word definitions, it absolutely is AGI: artificial general intelligence.

This isn't being pedantic, it's deliberately misinterpreting a commonly used term by taking each word literally for effect. Terms, like words, can take on a meaning distinct from what you'd get by reading each constituent part and assembling your own literal definition from those parts.
adastra22 7 days ago | parent
I didn't invent this interpretation. It's how the term was originally defined, and how it was used for many decades by the founders of the field. See for example: https://www-formal.stanford.edu/jmc/generality.pdf

Or look at the old/early AGI conference series. Or read any pre-2009 (ImageNet) AI textbook: it will talk about "narrow intelligence" vs. "general intelligence," a dichotomy that belongs more to GOFAI than to the deep learning approaches.

Maybe I'm a curmudgeon and this is entering get-off-my-lawn territory, but I find it immensely annoying when clear existing terminology (AGI vs. ASI, strong vs. weak, narrow vs. general) is superseded by a confused mix of popular meanings that lack any clear definition.