kelnos · 3 days ago

> Am I the only one who feels that Claude Code is what they would have imagined basic AGI to be like 10 years ago?

That wouldn't have occurred to me, to be honest. To me, AGI is Data from Star Trek. Or at the very least, Arnold Schwarzenegger's character from The Terminator. I'm not sure that I'd make sentience a hard requirement for AGI, but my general mental fantasy of AGI does include sentience. Claude Code is amazing, but I would never mistake it for AGI.
|
buu700 · 3 days ago

I would categorize sentient AGI as artificial consciousness [1], but I don't see an obvious reason AGI must inherently be conscious or sentient. (In terms of near-term economic value, non-sentient AGI seems like the more useful invention.)

For me, AGI is an AI to which I could assign an arbitrarily complex project and which, given sufficient compute and permissions, would succeed at the task as reliably as a competent C-suite human executive. For example, it could accept and execute on instructions to acquire real estate matching certain requirements, request approvals from the purchasing and legal departments as needed, handle government communication and filings, construct a widget factory on the property using a fleet of robots, and operate the factory on an ongoing basis while ensuring reliable widget deliveries to distribution partners.

Current agentic coding certainly feels like magic, but it's still not that.

[1] https://en.wikipedia.org/wiki/Artificial_consciousness
| |
ACCount37 · 2 days ago

"Consciousness" and "sentience" are terms mired in philosophical bullshit. We have no operational definition of either, no agreement on what either term really means, and certainly no test that could be administered to conclusively confirm or rule out "consciousness" or "sentience" in something inhuman. We don't even know for sure that all humans are conscious.

What we really have is task-specific performance metrics. This generation of AIs is already in the valley between "average human" and "human expert" on many tasks, and the performance of frontier systems keeps improving.

amanaplanacanal · 2 days ago

"Consciousness" seems pretty obvious: the ability to experience qualia. I do it, you do it, my dog does it. I suspect all mammals do, and I suspect birds do too. There is no evidence that any computer program does anything like it. It's "intelligence" I can't define.

ACCount37 · 2 days ago

Oh, so simple. Go measure it, then.

The definition of "featherless biped" might have more practical merit, because you can at least check for feathers and count the limbs touching the ground in a mostly reliable fashion. We have no way to "check for qualia" at all. For all we know, the ECU in a 2002 Toyota Hilux has qualia while 10% of all humans don't.

amanaplanacanal · 2 days ago

Plenty of things are real that can't be measured, including many physical sensations and emotions. I won't say they can never be measured, but we currently have no idea how.

ACCount37 · 2 days ago

If you can't measure it and can't compare it, then for all practical purposes it does not exist. "Consciousness" might as well not be real. The only real and measurable thing is capabilities.

amanaplanacanal · 2 days ago

Oof. Tell chronic pain patients that their pain doesn't exist. I guess depression doesn't exist either. Or love.
|
adastra22 · 3 days ago

I would love for you to define AGI in a way that makes that claim make sense. I presuppose as a starting point that you actually mean ASI, and even that is being charitable in assuming it isn't just pattern-matching to questionable sci-fi.