AnimalMuppet 2 days ago
Isn't that one of the measures of when it becomes an AGI? So that doesn't help you with however many nines we are away from getting an AGI. Even if you don't like that definition, you still have the question of how many nines we are away from having an AI that can contribute to its own development. I don't think you know the answer to that, and therefore I think your "fast acceleration within two years" is unsupported, just wishful thinking. If you've got actual evidence, I would like to hear it.
ben_w 2 days ago | parent | next
AI has been helping with the development of AI ever since at least the first optimising compiler or formal logic-circuit verification program. Machine learning has been helping with the development of machine learning ever since hyper-parameter optimisers became a thing. Transformers have been helping with the development of transformer models since… I don't know exactly when, but it was before ChatGPT came out. None of the initials in AGI are booleans.

But I do agree that:

> "fast acceleration within two years" is unsupported, just wishful thinking

Nobody has any strong evidence of how close "it" is, or even a really good shared model of what "it" even is.
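(To make the hyper-parameter-optimiser point concrete: even the simplest version, random search over a search space, is "ML tooling improving ML". A minimal sketch below; `val_loss` is a made-up stand-in for actually training and validating a model, and the parameter names and ranges are illustrative assumptions, not any particular library's API.)

```python
import random

def val_loss(lr, width):
    # Hypothetical objective: a cheap proxy for "train a model with these
    # hyper-parameters and measure validation loss". Minimum is at
    # lr=0.01, width=64 by construction.
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 1e4

def random_search(trials=200, seed=0):
    # Random search: sample hyper-parameters, keep the best trial seen.
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        lr = 10 ** rng.uniform(-4, -1)   # log-uniform learning rate in [1e-4, 1e-1]
        width = rng.randrange(8, 257)    # integer hidden width in [8, 256]
        loss = val_loss(lr, width)
        if best is None or loss < best[0]:
            best = (loss, lr, width)
    return best

best_loss, best_lr, best_width = random_search()
print(best_loss, best_lr, best_width)
```

Real optimisers (Bayesian optimisation, Hyperband, population-based training) are smarter about where to sample next, but the loop structure, propose, evaluate, keep the best, is the same.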
scragz 2 days ago | parent | prev
AGI is when it is general. A narrow AI trained only on coding and on training AIs could contribute to the acceleration without being AGI itself.