johnnyanmac 3 days ago:
If you want proper answers, yes. If you want to rely on whatever Reddit or TikTok says about the book, then I guess at that point you're fine with hallucinations and others doing the thinking for you anyway. Hence the issues brought up in the article. I wouldn't trust an LLM for anything more than the most basic questions if it didn't actually have text to cite.
catgary 3 days ago:
Luckily, the LLM has the text to cite: it can be passed in at inference time, which is legally distinct from training on the data.
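To sketch what "passed in at inference time" looks like in practice (a minimal example assuming the OpenAI Python client; the model name, excerpt, and prompt wording are all placeholders, not anyone's actual setup):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical excerpt; in a real system this would be retrieved from the book.
    book_excerpt = "It was the best of times, it was the worst of times..."

    # The text rides along in the prompt for this one request. The model only
    # conditions on it to answer; its weights are never updated, which is the
    # inference-time vs. training-time distinction being drawn here.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided source text, and quote it."},
            {"role": "user",
             "content": f"Source text:\n{book_excerpt}\n\n"
                        "Question: What contrast does the opening line set up?"},
        ],
    )
    print(response.choices[0].message.content)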
terafo 3 days ago:
Having access to the text and being trained on the text are two different things.