Retric 2 days ago

The general part of general intelligence. If they don’t think in those terms, there’s an inherent limitation.

Now, something that’s arbitrarily close to AGI but doesn’t care about endlessly working on drudgery etc. seems possible, but that’s also a more difficult problem: you’d need to be able to build AGI before you could create it.

AstroBen 2 days ago | parent [-]

Artificial general intelligence (AGI) refers to the hypothetical intelligence of a machine that possesses the ability to understand or learn any intellectual task that a human being can: generalization ability and common sense knowledge [1]

If we go by this definition then there's no caring, or noticing of drudgery? It's simply defined by its ability to generalize problem solving across domains. The narrow AI that we currently have certainly doesn't care about anything. It does what it's programmed to do

So one day we figure out how to generalize the problem solving and enable it to work on things a million times harder... and suddenly there is sentience and suffering? I don't see it. It's still just a calculator

1- https://cloud.google.com/discover/what-is-artificial-general...

krupan 2 days ago | parent | next [-]

It's really hard to picture a general intelligence that's useful but doesn't have any intrinsic motivation or initiative. My biggest complaint about LLMs right now is that they lack those things. They don't care whether they give you correct information or not, and you have to prompt them for everything! That's not anything close to AGI. I don't know how you get to AGI without it developing preferences, self-motivation, and initiative, and I don't know how you then get it to effectively do tasks it doesn't like, tasks that don't line up with whatever motivates it.

2 days ago | parent [-]
[deleted]
Retric 2 days ago | parent | prev | next [-]

“ability to understand”

Isn’t just the ability to perform a task. One of the issues with current AI training is that it’s really terrible at discovering which aspects of the training data are false and should be ignored. That requires all kinds of mental tasks to be constantly active, including evaluating emotional context to figure out if someone is being deceptive, etc.

AstroBen 2 days ago | parent [-]

> Isn’t just the ability to perform a task.

Right. In this case I'd say it's the ability to interpret data and use it to succeed at whatever goals it has

Evaluating emotional context would be similar to a chess engine calculating its next move. There's nothing there that implies consciousness, sentience, morals, feelings, suffering or anything 'human'. It's just a necessary intermediate function to achieve its goal
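To make that concrete, here's a toy sketch (purely illustrative; every name, cue, and threshold is made up and nothing here models a real system) of emotion evaluation as just another scoring subroutine feeding a goal-directed decision, the same way a chess engine's position evaluation feeds its move search:

    # Toy sketch: "emotion evaluation" as an ordinary scoring function
    # inside a goal-directed loop. All names and cues are hypothetical.
    def estimate_deception_risk(utterance: str) -> float:
        # Stand-in for a learned model scoring emotional/deceptive cues
        suspicious_cues = ("trust me", "honestly", "believe me")
        return sum(cue in utterance.lower() for cue in suspicious_cues) / len(suspicious_cues)

    def choose_action(utterance: str) -> str:
        # The "emotional" evaluation is just an intermediate value;
        # no feeling is implied anywhere in the pipeline
        risk = estimate_deception_risk(utterance)
        return "verify the claim" if risk >= 0.5 else "accept the claim"

    print(choose_action("Trust me, honestly, the data is fine"))  # -> verify the claim

Nothing in that pipeline suffers; the risk score is just a float.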

Rob Miles has some really good videos on AI safety research which touch on how an AGI would think. That's shaped a lot of how I think about it: https://www.youtube.com/watch?v=hEUO6pjwFOo

Retric 2 days ago | parent [-]

> Evaluating emotional context would be similar to a chess engine calculating its next move. There's nothing there that implies consciousness, sentience, morals, feelings, suffering or anything 'human'. It's just a necessary intermediate function to achieve its goal

If it’s limited to achieving goals it’s not AGI. Real-time personal goal setting based on human-equivalent emotions is an “intellectual task.” One of many requirements for AGI, therefore, is to (a) understand the world in real time and (b) emotionally respond to it. Aka, AGI would by definition “necessitate having feelings.”

There are philosophical arguments that there’s something inherently unique about humans here, but without some testable definition you could make the same argument that some arbitrary group of humans doesn’t have those qualities: “gingers have no souls.” Or perhaps “dancing people have no consciousness”, which seems like gibberish not because it’s a less defensible argument, but because you haven’t been exposed to it before.

AstroBen 2 days ago | parent [-]

I mean, we just fundamentally have different definitions of AGI. Mine's based on outcomes and what it can do, so it's purely goal based, not on processes that mimic humans or animals

I think this is the most likely first step of what would happen, seeing as we're pushing for it to be created to solve real-world problems

Retric a day ago | parent [-]

I’m not sure how you can argue something is a general intelligence if it can’t do those kinds of things? Comes out of the factory with a command: “Operate this android for a lifetime pretending to be human.”

Seems like arguing something is a self-driving car when it needs a backup human driver for safety. It’s simply not what the people who initially came up with the term meant, and not what a plain-language understanding of the term would suggest.

AstroBen a day ago | parent [-]

Because I see intelligence as the ability to produce effective actions towards a goal. A more intelligent chess AI beats a less intelligent one by making better moves towards the goal of winning the game

The G in AGI is being able to generalize that intelligence across domains, including ones it's never seen before, as a human could

So I would fully expect an advanced AGI to be able to pretend to be a human. It has a model of the world, knows how humans act, and could move the android in a human-like manner, speak like a human, and learn the skills a human could

Is it conscious or feeling though? Or following the same processes that a human does? That's not necessary. Birds and planes both fly, but they're clearly different things. We (probably) don't need to simulate the brain to create this kind of intelligence

Let's pinch this AGI to test if it 'feels pain':

<Thinking>

Okay, I see that I have received a sharp pinch at (55, 77, 3), the elbow region

My goal is to act like a human. In this situation a human would likely exhibit a pain response

A pain response for humans usually involves a facial expression and often a verbal acknowledgement

Humans normally respond quite slowly, so I should wait 50ms to react

"Hey! Why did you do that? That hurt!"

...Is that thing human? I bet it'll convince most of the world it is... and that's terrifying
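For what it's worth, that whole exchange could be driven by something as dumb as this toy lookup (a hypothetical sketch; the 50ms figure comes from the thought experiment above, not from any real system):

    import time

    # Toy sketch of the pinch scenario: the "pain response" is an output
    # chosen to match expected human behavior; no felt state exists anywhere.
    def respond_to_stimulus(stimulus: str, location: str) -> str:
        human_reactions = {
            "sharp pinch": "Hey! Why did you do that? That hurt!",
            "light tap": "Oh, what was that?",
        }
        time.sleep(0.05)  # ~50ms pause so the reaction looks human-paced
        return human_reactions.get(stimulus, "...")

    print(respond_to_stimulus("sharp pinch", "elbow"))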

Retric a day ago | parent [-]

> Is it conscious or feeling though?

You’re falling into the “gingers have no souls” trap I just spoke of.

We don’t define humans as individual components, so your toe isn’t you, but by that same token your car isn’t you either. If some subcomponent of a system is emulating a human consciousness, then we don’t need to talk about the larger system here.

AGI must be able to do these things, but it doesn’t need to have human mental architecture. Something that can simulate physics well enough could emulate all the atomic-scale interactions in a human brain, for example. That virtual human brain would then experience everything we do, even if the system running the simulation didn’t.

imtringued 2 days ago | parent | prev [-]

Exactly. It's called artificial general intelligence, not human general intelligence.

Retric a day ago | parent [-]

Something can’t “operate this cat android, pretending to be a cat” if it can’t do what I described.

A single general intelligence needs to be able to fly an aircraft, get a degree, run a business, and raise a baby to adulthood just like a person, or it’s not general.

9rx a day ago | parent [-]

So AGI is really about the hardware?

Retric a day ago | parent [-]

We’ve built hardware capable of those things if remotely controlled. It’s the thinking bits that are hard.

9rx a day ago | parent [-]

Only to the extent of having specialized bespoke solutions. We have hardware to fly a plane, but that same hardware isn't able to throw a mortarboard in the air after receiving its degree, and the hardware that can do that isn't able to lactate for a young child.

General intelligence is easy compared to general physicality. And, of course, if you keep the hardware specialized to make its creation more tractable, what do you need general intelligence for? Special intelligence that matches the special hardware will work just as well.

Retric a day ago | parent [-]

Flying an aircraft requires talking to air traffic control, which existing systems can’t do. Though that’s obviously not a huge issue when the aircraft already has radios, except that all those FAA regulations apply to every single aircraft you’re retrofitting.

The advantage of general intelligence is that a small set of hardware now lets you tackle a huge range of tasks, or in the above case a huge range of aircraft types. We can mix speakers, eyes, and hands to do a vast array of tasks. Needing new hardware and software for every task very quickly becomes prohibitive.

9rx 9 hours ago | parent [-]

The advantage of general intelligence is that it can fly you home to the nearest airport, drive you the last mile, and, once home, cook you supper. But for that you need the hardware to be equally general.

If you need to retrofit airplanes, and in such a way that the hardware is specific to flying, there's no need for general intelligence. Special intelligence will work just as well. Multimodal AI isn't AGI.

Retric 8 hours ago | parent [-]

No, the advantage of AGI isn’t being able to do all those physical things; the advantage of AGI is that you don’t need to keep building new software for every task.

Let’s suppose you wanted to replace a pilot for a 747. Now you need to be able to fly, land, etc., which we’re already capable of. However, the actual job of a pilot goes well past just flying.

You also need to do the preflight: verifying the fuel is appropriate for the trip, checking the weather and alternate landing spots, doing the preflight walk around the aircraft, etc. It also needs to keep up with any changing procedures. As special-purpose software you’re talking about a multi-billion-dollar investment, or you can have an AGI run through the normal pilot training and certification process for a trivial fraction of those costs.

That’s the promise of AGI.

9rx 8 hours ago | parent [-]

> the advantage of AGI is that you don’t need to keep building new software for every task.

Even the human brain seems to be 'built' for its body. You're moving into ASI territory if the software can configure itself for the body automatically.

> That’s the promise of AGI.

That's the promise of multimodal AI. AGI requires general ability – meaning basically able to do anything humans can – which requires a body as capable as a human's body.

Retric 2 hours ago | parent [-]

Human brains aren’t limited to the standard human body plan. People born with an extra finger have no issues operating that finger just as well as people with the normal complement of fingers. Animal experiments have pushed this quite far.

If your AI has an issue because the robot has a different body plan, then no, it’s not AGI. That doesn’t mean it needs to be able to watch every camera in a city at the same time, but you can use multiple AGIs.