idiotsecant a day ago

This is the start of what I always thought an AI should have - a limbic system. Humans don't store memory based on novelty, they store it based on emotional content. This is where I was afraid of the tiger, this is where I smelled delicious food, this is what it felt like when I was victorious in the hunt.

AI needs an internal emotional state because that's what drives attention and memory. AI needs to want something.

luckydata a day ago | parent | next [-]

That would be the biggest mistake anyone could make. I hope nobody goes down this route. An AI "wanting" things is an enormous risk to alignment.

pixl97 a day ago | parent | next [-]

I mean, giving any neural net a 'goal' is really just defining a want/need. You can't encode the entire problem space of reality, so you have to give the system something to filter with.
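To make the point concrete, here's a minimal illustrative sketch (my own, not from this thread): any gradient-trained system needs a scalar objective, so a "goal" is baked in as the loss function. The target value and training loop below are hypothetical, chosen purely for illustration.

```python
def loss(weight, x, target):
    # The objective encodes what the system "wants": small prediction error.
    prediction = weight * x
    return (prediction - target) ** 2

def train(x=2.0, target=10.0, lr=0.01, steps=200):
    w = 0.0
    for _ in range(steps):
        # Finite-difference estimate of d(loss)/d(weight).
        eps = 1e-6
        grad = (loss(w + eps, x, target) - loss(w - eps, x, target)) / (2 * eps)
        w -= lr * grad  # descend toward whatever the loss defines as "good"
    return w

print(round(train(), 2))  # converges toward 5.0, since 5.0 * 2.0 == 10.0
```

The net never "chose" to want a small error; the objective was imposed from outside. That's the sense in which every trained goal is a designed want.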

idiotsecant a day ago | parent | prev [-]

At some point I think we'll have to face the idea that any AI more intelligent than ourselves will by definition be able to evade our alignment tricks.

luckydata a day ago | parent [-]

Equating "more intelligent" with "wanting things" is a fallacy. You can have a hyper-intelligent computer that simply waits for you to ask it to do a job, or you can endow it with the digital equivalent of hunger and reproductive instincts, and it will behave completely differently.

We would be INSANE to give that type of instinct to AIs.

drdeca 18 hours ago | parent | next [-]

For some senses of “wanting things”, I think it might be hard to make a powerful AI that couldn’t be easily modified to produce one that “wants things” in some sense.

So, if it would be a bad thing for one to be made that "wants things" in any reasonable sense of the phrase, then it would probably also be bad for J Random to be able to take a copy of a powerful AI and modify it, because someone is likely to try exactly that.

Of course, perhaps the best way to make sure that J Random doesn’t have the ability to do that, is to make sure no one does.

sayamqazi 14 hours ago | parent | prev [-]

You are making a claim that "intelligence" is separable from the other things found in humans and other animals. There is no proof or example supporting this.

I have come to believe that we will only be able to truly replicate intelligence if the system is trying to preserve itself. It's the biggest incentive there is to do intelligent things.

red75prime 11 hours ago | parent | prev [-]

...this is where I randomly decided to remember this particular day of my life. Yep, I indeed did it because why not. No, it didn't work particularly well, but I do remember some things about that day.

I mean, it's not just an automatic thing with no higher-level control.