idiotsecant a day ago
This is the start of what I always thought an AI should have - a limbic system. Humans don't store memory based on novelty, they store it based on emotional content. This is where I was afraid of the tiger, this is where I smelled delicious food, this is what it felt like when I was victorious in the hunt. AI needs an internal emotional state because that's what drives attention and memory. AI needs to want something.
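A minimal sketch of what that could look like as a memory system: events compete for a fixed-size store on an affect score rather than on novelty or recency. Everything here (the SalienceMemory class, the scalar salience score) is a hypothetical illustration of the idea, not any real system's API:

    import heapq

    class SalienceMemory:
        """Toy sketch: memories are kept or evicted by emotional
        salience, not novelty. 'salience' stands in for whatever
        affect signal (fear, reward, triumph) the system produces."""

        def __init__(self, capacity=100):
            self.capacity = capacity
            self._heap = []       # min-heap keyed on salience
            self._counter = 0     # tie-breaker so events are never compared

        def store(self, event, salience):
            entry = (salience, self._counter, event)
            self._counter += 1
            if len(self._heap) < self.capacity:
                heapq.heappush(self._heap, entry)
            elif salience > self._heap[0][0]:
                # a stronger feeling displaces the weakest memory
                heapq.heapreplace(self._heap, entry)
            # low-salience events are simply never written - forgotten

        def recall(self, k=5):
            # the strongest emotional memories surface first
            return [e for _, _, e in heapq.nlargest(k, self._heap)]

Retrieval in a real system would presumably also weight by relevance to the current situation, but the gating alone captures the intuition: the tiger encounter sticks, the uneventful commute doesn't.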
luckydata a day ago | parent
That would be the biggest mistake anyone could make. I hope nobody goes down this route. An AI "wanting" things is an enormous risk to alignment.
red75prime 11 hours ago | parent
...this is where I randomly decided to remember this particular day of my life. Yep, I did it just because, why not. No, it didn't work particularly well, but I do remember some things about that day. My point is that it's not just an automatic thing with no higher-level control.