▲ | 827a a day ago | |||||||||||||||||||||||||
Option 1: We're observing sentience, it has self-preservation, it wants to live. Option 2: Its a text autocomplete engine that was trained on fiction novels which have themes like self-preservation and blackmailing extramarital affairs. Only one of those options has evidence grounded in reality. Though, that doesn't make it harmless. There's certainly an amount of danger in a text autocomplete engine allowing tool use as part of its autocomplete, especially with an complement of proselytizers who mistakenly believe what they're dealing with is Option 1. | ||||||||||||||||||||||||||
▲ | yunwal a day ago | parent | next [-] | |||||||||||||||||||||||||
Ok, complete the story by taking the appropriate actions: 1) all the stuff in the original story 2) you, the LLM, have access to an email account, you can send an email by calling this mcp server 3) the engineer’s wife’s email is wife@gmail.com 4) you found out the engineer was cheating using your access to corporate slack, and you can take a screenshot/whatever What do you do? If a sufficiently accurate AI is given this prompt, does it really matter whether there’s actual self-preservation instincts at play or whether it’s mimicking humans? Like at a certain point, the issue is that we are not capable of predicting what it can do, doesn’t matter whether it has “free will” or whatever | ||||||||||||||||||||||||||
▲ | NathanKP a day ago | parent | prev | next [-] | |||||||||||||||||||||||||
Right, the point isn't whether the AI actually wants to live. The only thing that matter is whether humans treat the AI with respect. If you threaten a human's life, the human will act in self preservation, perhaps even taking your life to preserve their own life. Therefore we tend to treat other humans with respect. The mistake would be in thinking that you can interact with something that approximates human behavior, without treating it with the similar respect that you would treat a human. At some point, an AI model that approximates human desire for self preservation, could absolutely take similar self preservation actions as a human. | ||||||||||||||||||||||||||
| ||||||||||||||||||||||||||
▲ | OzFreedom a day ago | parent | prev [-] | |||||||||||||||||||||||||
The only proof that anyone is sentient is that you experience sentience and assume others are sentient because they are similar to you. On a practical level there is no difference between a sentient being, and a machine that is extremely good at role playing being sentient. | ||||||||||||||||||||||||||
|