idiotsecant a day ago
At some point I think we'll have to face the idea that any AI more intelligent than ourselves will by definition be able to evade our alignment tricks.
luckydata a day ago | parent
Equating greater intelligence with "wanting things" is a fallacy. You can have a hyper-intelligent computer that simply waits for you to ask it to do a job, or you can endow it with the digital equivalent of hunger and reproductive instincts, and it will behave completely differently. We would be INSANE to give AIs that type of instinct.