gmac | 3 days ago
Not really: it's arguably quite a lot worse, because you can judge the trustworthiness of the source when you follow a link from Google (e.g. I will place quite a lot of faith in pages at an .nhs.uk URL), whereas nobody knows exactly how that specific LLM response got generated.
naasking | 2 days ago | parent
Many of the big LLMs do RAG and will provide links to sources, e.g. Bing/ChatGPT, Gemini Pro 2.5, etc.
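The core idea behind RAG-with-citations can be sketched in a few lines: retrieve documents relevant to the query, then surface their URLs alongside the answer so the reader can judge the sources themselves. This is a toy illustration only — the corpus, overlap-based scoring, and answer assembly are stand-ins I've made up, not any real product's pipeline.

```python
# Toy sketch of retrieval-augmented generation with source links.
# A real system would embed the query, search an index, and feed the
# retrieved passages to an LLM; here, naive token overlap stands in
# for retrieval, and the answer simply echoes the retrieved text.

CORPUS = [
    {"url": "https://www.nhs.uk/conditions/flu/",
     "text": "Flu symptoms include a sudden high temperature and body aches"},
    {"url": "https://example.org/gardening",
     "text": "Tomatoes grow best in full sun with regular watering"},
]

def retrieve(query, corpus, k=1):
    """Rank documents by naive token overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(q & set(d["text"].lower().split())),
                    reverse=True)
    return scored[:k]

def answer_with_sources(query, corpus):
    """Return an answer grounded in retrieved docs, citing their URLs."""
    docs = retrieve(query, corpus)
    # A real pipeline would pass `docs` as context to the model; the
    # key point is that the sources travel with the answer.
    return {"answer": docs[0]["text"],
            "sources": [d["url"] for d in docs]}

result = answer_with_sources("what are flu symptoms", CORPUS)
print(result["sources"])
```

Because the cited URLs are returned with the response, the reader regains the signal the parent comment values — the ability to weigh, say, an .nhs.uk link against an unknown domain.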