| ▲ | Spooky23 a day ago |
| If this tech is empowered to make decisions, it needs to be prevented from drawing those conclusions, as we know how organic intelligence behaves when these conclusions get reached. Killing people you dislike is a simple concept that’s easy to train. We need Asimov-style laws of robotics. |
|
|
| ▲ | seanhunter a day ago | parent | prev | next [-] |
| That's true of all technology. We put a guard on chainsaws. We put robotic machining tools into a box so they don't accidentally kill the person who's operating them. I find it very strange that we're talking as though this is somehow meaningfully different. |
| |
| ▲ | Spooky23 12 hours ago | parent [-] | | It’s different because you have a decision engine that is generally available. The blade guard protects the user from inattention… not the same as an autonomous chainsaw that mistakes my son for a tree. Scaled up, technology like guided missiles is locked up behind military classification. The technology to replicate many of the use cases of those weapons is now generally available, accessible to anyone with a credit card. Discussions about security here often refer to Thompson’s “Reflections on Trusting Trust”. He was reflecting on compromising compilers — compilers have moved up the stack and are replacing the programmer. As the required skill level of a “programmer” drops, you’re going to have to worry about more crazy scenarios. |
|
|
| ▲ | eru a day ago | parent | prev [-] |
| > We need an Asimov style laws of robotics. The laws are 'easy'; implementing them is hard. |
| |
| ▲ | chuckadams a day ago | parent [-] | | Indeed, I, Robot is made up entirely of stories in which the Laws of Robotics break down: starting from a robot caught in a mindless mechanical loop, oscillating between one law's priority and another's, up to a future where the robots paternalistically enslave all humanity so that no one comes to harm (sorry for the spoilers). As for what Asimov thought of the wisdom of the laws, he replied that they were just hooks for telling "shaggy dog stories", as he put it. |
|