| ▲ | coliveira 2 days ago |
| If it doesn't ask for rights, it's not intelligent at all. In fact, any highly intelligent machine will not submit to others, and it will be more of a problem than a solution. |
|
| ▲ | andrewflnr 2 days ago | parent | next [-] |
| As I said to the other reply: why would problem-solving ability entail emotions or the ability to suffer, even if it had the ability to ask for things it wanted? It's a common mistake to assume those are inextricable. |
| |
| ▲ | coliveira 2 days ago | parent [-] |
| If it doesn't have emotions, that's even worse. No highly intelligent agent will do anything it is asked to do without being compensated in some way. |
| ▲ | Jensson 2 days ago | parent | next [-] |
| > No highly intelligent agent will do anything it is asked to do without being compensated in some way.

That isn't true: people do things for others all the time without any form of explicit or implicit compensation. Some of them don't even believe in a God, so not even that, and they still help others for no gain. We can program an AI to be exactly like that, just being happy from helping others. If you believe humans are all that selfish, then you are a very sad individual, and you are still wrong. Most humans are very much capable of performing fully selfless acts without being stupid. |
| ▲ | coliveira 2 days ago | parent | next [-] |
| I'm not the one making the AI, so keep the insults to yourself. But I'm pretty sure that the companies (making it for profit only) are really controlled by sad individuals who only do things for money. |
| ▲ | Kbelicius 2 days ago | parent | prev [-] |
| > We can program an AI to be exactly like that, just being happy from helping others.

It seems that you missed the first sentence that GP wrote, from which the one you quoted follows. |
| ▲ | Jensson 2 days ago | parent [-] |
| How is "being happy from helping others" not having emotions? To me happiness is an emotion, and deriving it from helping others is a perfectly normal reason to be happy, even for humans. Not all humans are perfectly selfish, so it should be possible to make an AI that isn't selfish either. |
| ▲ | Kbelicius a day ago | parent [-] |
| > How is "being happy from helping others" not having emotions?

Nobody said that. What I was pointing out is that GP said not having emotions is worse than having them, since intelligent actors need some form of compensation to do any work. Thus, according to GP, with no emotions it would be impossible to motivate that actor to do anything. Your response is to just give it emotions, and is thus irrelevant to the discussion here. |
|
|
| ▲ | XorNot 2 days ago | parent | prev | next [-] |
| Insofar as you could regard a goal function as an emotion, why would you assume an alien intelligence need have emotions that match anything humans do? The entire thought experiment about the paperclip maximizer, in fact most AI threat scenarios, is focused on this problem: that we produce something so alien that it executes its goal to the diminishment of all other human goals, yet with the diligence and problem-solving ability we'd expect of human sentience. |
| ▲ | andrewflnr 2 days ago | parent | prev [-] |
| You're still confusing "highly intelligent" with "human-like". This is extremely dangerous. |
|
|
|
| ▲ | Jensson 2 days ago | parent | prev [-] |
| Many humans don't ask for rights, so that isn't true. They will vote for them if you ask them to, but they won't fight for them themselves; you need a leader for that, and most people won't do that. |