archaeans 6 days ago

And the often-missed caveat is that we should only care whether the software does what it is supposed to do.

In that light, LLMs are just buggy, and have been for years. Where is the LLM that does what it says it should do? "Hallucination" and "do they reason" are distractions. They fail. They're buggy.