benedictevans 4 days ago

I tried to capture this on the last slide before the conclusion - maybe all AI questions have one of two answers - "no-one knows" or "it will be the same as the last time"

this is one of the "no-one knows" questions

Animats 4 days ago

The question I'm asking isn't whether hallucinations can be fixed. It's: if they are not fixed, what are the economic consequences for the industry? How necessary is it that LLMs become trustworthy? How much of the current valuation assumes that they will?

Sateeshm a day ago

And is it even fixable?

namaria 10 hours ago

The "hallucinations" problem feels to me like an inherent feature. For LLMs to have interesting output the temperature needs to be higher then zero. The whole system is interesting because it is probabilistic. "Hallucinations" (hate the word btw) are to LLMs as melting is to ice. There will be no 'meltless' ice because the melting is what makes it cold and useful.