It will not, in fact, always do what you ask of it, because it lacks any understanding, though the chat interface and the prolix nature of LLMs do a good job of hiding that.