LLMs are usually not aware of their own true capabilities, so when you ask a model what it can or cannot do, the answer you get back has a high probability of being hallucinated.