yreg 3 days ago

I find it useful to underline that this is an intrinsic property of LLMs: when an LLM makes up something untrue, it's not a 'bug'.

I think a good mindset is to treat all LLM output as 'hallucination' while taking advantage of the fact that these hallucinations often happen to be true of the real world. This framing is especially valuable for nontechnical people, who might otherwise not realise that the output can be confidently wrong.