criley2 · 5 days ago:
Advanced reasoning LLMs simulate many parts of AGI and feel really smart, but fall short in many critical ways.

- An AGI wouldn't hallucinate; it would be consistent, reliable, and aware of its own limitations.
- An AGI wouldn't need extensive re-training, human-reinforced training, or model updates. It would be capable of true self-learning / self-training in real time.
- An AGI would demonstrate real, genuine understanding and mental modeling, not pattern matching over correlations.
- It would demonstrate agency and motivation, not be purely reactive to prompting.
- It would have persistent, integrated memory. LLMs are stateless and driven by the current context (see the sketch below).
- It should even demonstrate consciousness.

And more. I agree that what we've designed is truly impressive and simulates intelligence at a really high level, but true AGI is far more advanced.
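To make the statelessness point concrete: a chat model keeps no memory between requests, so the caller has to re-send the entire conversation on every turn. Here's a minimal sketch using the OpenAI Python client (the model name is just a placeholder; any chat-completion API works the same way):

    # Sketch: what looks like "memory" is really the caller's context window.
    # The model retains nothing between requests; the whole history is re-sent each turn.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    history = [{"role": "system", "content": "You are a helpful assistant."}]

    def chat(user_message: str) -> str:
        history.append({"role": "user", "content": user_message})
        resp = client.chat.completions.create(
            model="gpt-4o",       # placeholder model name
            messages=history,     # the entire conversation goes up every time
        )
        reply = resp.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        return reply

    # If `history` is dropped (or overflows the context window), the model "forgets":
    # there is no persistent, integrated memory on the model side.

Any persistence (chat history, "memories", RAG) lives outside the model and is stitched back into the prompt by the application.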
waffletower · 5 days ago:
Humans can fail at some of these qualifications, often without guile:

- being consistent and knowing their limitations
- demonstrating effective understanding and mental modeling (people do not universally manage either)

I don't believe the "consciousness" qualification is appropriate at all; I would argue it is a projection of the human machine's experience onto an entirely different machine with a substantially different existential topology, i.e. a different relationship to time and sensorium. I don't think artificial general intelligence is a binary label applied only if a machine rigidly simulates human agency, memory, and sensing.
versteegen · 4 days ago:
> - It should even demonstrate consciousness.

I disagreed with most of your assertions even before I hit the last point. This is just about the most extreme thing you could ask for; I think very few AI researchers would agree with this definition of AGI.
lysace · 5 days ago:
Thanks for humoring my stupid question with a great answer. I was kind of hoping for something like this :). | ||