HarHarVeryFunny 3 days ago

> The key thing is to develop an intuition for questions it can usefully answer vs questions that are at a level of detail where the lossiness matters

It's also useful to have an intuition for what an LLM is liable to get wrong or hallucinate. One such case is a question that itself suggests one or more plausible answers (which may or may not be correct): if the LLM doesn't actually "know", it may well hallucinate one of the suggested answers and sound perfectly reasonable doing so.

felipeerias 3 days ago

LLMs are very sensitive to leading questions. A small hint of what the expected answer looks like will tend to produce exactly that answer.
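
This is easy to reproduce. A minimal sketch, assuming the OpenAI Python client (the model name and prompts are made up for illustration): the second, leading prompt will usually get the hinted diagnosis echoed back, whether or not it's the real cause.

  # Sketch of the leading-question effect; any chat LLM API would do.
  from openai import OpenAI

  client = OpenAI()

  def ask(prompt: str) -> str:
      resp = client.chat.completions.create(
          model="gpt-4o-mini",  # assumption: any chat model shows this
          messages=[{"role": "user", "content": prompt}],
      )
      return resp.choices[0].message.content

  # Neutral phrasing: the model has to commit to its own diagnosis.
  print(ask("Why is my Python service using more memory over time?"))

  # Leading phrasing: the hinted cause tends to come straight back,
  # whether or not it is actually the problem.
  print(ask("My Python service leaks memory -- it's the cache not being "
            "evicted, right? Why does the cache cause this?"))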

giantrobot 3 days ago

You don't even need a leading direct question. You can easily lead an LLM just by having some statements (sometimes even single words) in the context window. A sketch of this, again assuming the OpenAI Python client (model and messages are illustrative): the question is identical in both calls below, and only one parenthetical in an earlier message differs.
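
  # Sketch of context contamination: no leading question, only a
  # loaded word earlier in the context window.
  from openai import OpenAI

  client = OpenAI()

  def ask_with_context(context: str, question: str) -> str:
      resp = client.chat.completions.create(
          model="gpt-4o-mini",  # assumption, as in the sketch above
          messages=[
              {"role": "user", "content": context},
              {"role": "assistant", "content": "Noted."},
              {"role": "user", "content": question},
          ],
      )
      return resp.choices[0].message.content

  question = "What is most likely slowing down this endpoint?"
  print(ask_with_context("Here are my notes on the service.", question))
  # One loaded word in the context tends to tilt the diagnosis toward it.
  print(ask_with_context("Here are my notes on the service. "
                         "(Deadlocks everywhere lately.)", question))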

SAI_Peregrinus 3 days ago

As a consequence, LLMs are extremely unlikely to recognize an X-Y problem: ask how to do X (your attempted solution) and the model will dutifully help with X, rather than noticing that what you actually need is Y.