| ▲ | delichon a day ago |
> Miss those, and you're not maximally useful. And if it's not maximally useful, it's by definition not AGI.

I know hundreds of natural general intelligences who are not maximally useful, and dozens who are not at all useful. What justifies changing the definition of general intelligence for artificial ones?
|
| ▲ | throw310822 20 hours ago | parent | next [-] |
At some point "general AI" stopped being the opposite of "narrow AI", that is, AI specialised for a single task (e.g. speech or handwriting recognition, sentiment analysis, protein folding, etc.), and became practically synonymous with superintelligence. ChatGPT 3.5 is already a general AI by the old definition, as it can perform a wide variety of tasks without any task-specific pre-training.
| ▲ | marcosdumay 19 hours ago | parent [-] |
> ChatGPT 3.5 is already a general AI based on the old definition

It's not. It's a query-retrieval system that can parse human language. Just like every LLM.
| ▲ | fl7305 17 hours ago | parent | next [-] |
>> ChatGPT 3.5 is already a general AI based on the old definition
> It's not. It's a query-retrieval system that can parse human language.

And humans aren't general AI either. They're just DNA replicators. It is very obvious when you realize that humans weren't designed to be intelligent. They were just randomly iterated through an environment which selected for maximum DNA replication. Until you have a higher being which explicitly designs for intelligence, you'll just get things like LLM query-retrievers, or DNA replicators.
| ▲ | delichon 19 hours ago | parent | prev | next [-] |
It's a device for channeling the intelligence inherent in human language. The fact that its intelligence is located more in its human data than in its artificial algorithms doesn't make its output less generally intelligent.
| ▲ | throw310822 19 hours ago | parent | prev [-] |
> It's a query-retrieval system that can parse human language

I can't help being astounded by the confidence with which humans hallucinate completely improbable explanations for phenomena they don't understand at all.
|
| ▲ | GavCo 19 hours ago | parent | prev | next [-] |
Author here, thanks for the input. I agree that this bit was clunky. I made an edit to avoid unnecessarily getting into the definition of AGI here, and added a note.
|
| ▲ | jwpapi 20 hours ago | parent | prev | next [-] |
Yes, exactly that sentence led me to step away from the article. It is wrong in many ways and gives me no confidence in the OP's opinions or research abilities.
|
| ▲ | exe34 a day ago | parent | prev [-] |
| they were born in carbon form by sex. |