f_devd 3 days ago
Ideally, yes: LLMs are tools that we expect to work, whereas people are inherently fallible and (even unintentionally) deceptive. LLMs being human-like in this specific way is not desirable.
stavros 3 days ago | parent
Then I think you'll be very disappointed. LLMs aren't in the same category as calculators, for example. | ||||||||