Another point: in my experience, LLMs and humans tend to fail in different ways, so a human reviewing an LLM's output is likely to catch its failures, since the human is unlikely to make the same mistakes.