Animats 4 days ago
The question I'm asking isn't whether hallucinations can be fixed. It's this: if they aren't fixed, what are the economic consequences for the industry? How necessary is it that LLMs become trustworthy? How much of the current valuation assumes that they will?
Sateeshm a day ago
And is it even fixable?