f_devd 3 days ago

Ideally, yes. LLMs are tools that we expect to work; people are inherently fallible and (even unintentionally) deceptive. LLMs being human-like in this specific way is not desirable.

stavros 3 days ago | parent [-]

Then I think you'll be very disappointed. LLMs aren't in the same category as calculators, for example.

f_devd 3 days ago | parent [-]

I have no illusions about LLMs; I have been working with them since the original BERT, always with these same issues and more. I'm just stating what would be needed for them to become reliably useful outside of creative writing & (human-guided & checked) search.

If an LLM produces an incorrect or orthogonal answer and there is no way to reliably fix or debug it, it's just not as useful as it theoretically could be given the data contained in its parameters.
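
To make the "checked" part concrete, here is a minimal sketch of one way to guard against an answer you can't fix: treat the model as an untrusted generator and mechanically validate its output before using it. Everything here is hypothetical, not a specific API; llm_complete is a stand-in for whatever client you actually call, and the extraction task and key names are made up for illustration.

    import json

    def llm_complete(prompt: str) -> str:
        """Hypothetical stand-in for whatever model client you actually use."""
        raise NotImplementedError

    def checked_extraction(document: str, max_attempts: int = 3) -> dict:
        """Ask for structured output and retry until it passes a mechanical check."""
        prompt = (
            'Extract the title and author from the text below as JSON '
            'with keys "title" and "author".\n\n' + document
        )
        for _ in range(max_attempts):
            raw = llm_complete(prompt)
            try:
                data = json.loads(raw)
            except json.JSONDecodeError:
                continue  # malformed output: retry instead of trusting it
            if isinstance(data, dict) and {"title", "author"} <= data.keys():
                return data
        raise ValueError("no valid answer after retries; fall back to a human")

The check catches malformed output, but it says nothing about whether the extracted values are actually correct; that gap is exactly the debugging problem being described.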