catgary 3 days ago
Are you implying that an LLM needs to be trained on a specific piece of text to answer questions about it?
johnnyanmac 3 days ago | parent
If you want proper answers, yes. If you want to rely on whatever Reddit or TikTok says about the book, then I guess at that point you're fine with hallucinations and with others doing the thinking for you anyway. Hence the issues brought up in the article. I wouldn't trust an LLM for anything more than the most basic questions if it didn't actually have text to cite.