madeofpalk 18 hours ago

It’s apt, because the only thing LLMs do is hallucinate; they have no grounding in reality. They take your input and hallucinate something “useful” from it.