fzeroracer 7 days ago

> Because the middleman is faster and practically never lies/hallucinates for simple queries

How do you KNOW it doesn't lie/hallucinate? To know that, you have to verify what it says, and to verify what it says you need to check outside sources, like Wikipedia. So what I'm saying is: why bother wasting time with the middleman? 'Vague queries' can be distilled into simple keyword searches: if I want to know what a 'tsunami' is, I can just plug that keyword into a Wikipedia search and skim the page or ctrl-f for the information I want instantly.
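(For what it's worth, that keyword-lookup workflow is trivial to script too. The sketch below is just my own illustration, not anything the parent comment describes: it assumes Python 3 with the third-party requests library and uses Wikipedia's public REST summary endpoint to pull the lead extract for a term like 'Tsunami'.)

    # Minimal sketch of the keyword-lookup workflow, assuming Python 3
    # and the "requests" library, against Wikipedia's REST summary API.
    import requests

    def wiki_summary(term: str) -> str:
        """Fetch the lead-section extract for a keyword, e.g. 'Tsunami'."""
        url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{term}"
        resp = requests.get(url, headers={"accept": "application/json"}, timeout=10)
        resp.raise_for_status()
        return resp.json()["extract"]

    print(wiki_summary("Tsunami"))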

If you assume that it doesn't lie/hallucinate because it was right on previous requests, then you fall into exactly the trap that eventually blows your foot off, because sometimes it can and will hallucinate even on benign things.

rockemsockem 6 days ago | parent

I feel like you're coming from a very strange place: you happily use advanced technology that saves you time and expands your personal knowledge base, while insisting that a more advanced technology that saves you even more time and expands your knowledge base even further is useless and a time sink.

For most questions it is so much faster to validate a correct answer than to figure out the answer to begin with. Vague queries CANNOT be distilled into simple keyword searches when you don't know where to start, not without a significant time investment. Ctrl-f relies on you and the article sharing the exact same preferred vocabulary for the exact same concepts.

I do not assume that LLMs don't lie or hallucinate; I start from the assumption that they will be wrong, which, for the record, is the same assumption I make about both websites and human beings.