rvnx 8 hours ago
LLMs have their issues too. In everyday life, you cannot read 20 books on every topic you are curious about, but you can ask 5 subject-experts ("the LLMs") in 20 seconds, some of whom will check a few news websites (most of which are also biased). Then you can ask for summaries of the pros and cons and form your own opinion. Are they hallucinating? Could be. Are they lying? Could be. Have they been trained on what their masters told them to say? Could be. But multiplying the number of LLMs reduces the risk. For example, if you ask DeepSeek, Gemini, Grok, Claude, GLM-4.7, or some models with no guardrails what they think about XXX, there may be interesting insights.
jamespo 7 hours ago | parent
This may shock you, but Wikipedia provides multiple sources; it even links to them. Where do you think the LLMs are getting their data from?