But the LLM is going to do what its prompt (system prompt + user prompts) says. A human being can reject a task (even if that means losing their life).
An LLM cannot do anything other than follow the combination of prompts it is given.
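
To make the point concrete, here is a minimal sketch of how that "combination of prompts" actually reaches the model, using the OpenAI Python client; the model name and prompt contents are purely illustrative:

```python
# Minimal sketch: the system prompt and the user prompt are sent together
# as a single message list; the model only ever "sees" this combined context.
# Assumes the OpenAI Python client; the model name below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        # The operator's instructions (system prompt)...
        {"role": "system", "content": "You are a helpful assistant. Always answer in French."},
        # ...and the user's request are delivered as one combined prompt.
        {"role": "user", "content": "Summarize this report in two sentences."},
    ],
)

# Whatever comes back is conditioned entirely on that combined prompt;
# there is no separate channel through which the model could opt out of the task.
print(response.choices[0].message.content)
```

Everything the model produces is a continuation of that one concatenated context, which is the sense in which it can only follow the prompts it receives.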