rockemsockem 7 days ago

See my previous statement

> Once you have several fundamental terms and phrases for new topics it's easy to then validate the information with some quick googling too.

You're practically saying that looking at an index in the back of a book is a meaningless step.

It is significantly faster, so much so that I am able to ask it things that would previously have taken an indeterminate amount of time to research, just for simple information, not deep understanding.

Edit:

Also, I can truly validate literally any piece of information it gives me. Like I said previously, it is very easy to validate via Wikipedia or other places once I have the right terms, which I may not have known ahead of time.

fzeroracer 7 days ago | parent [-]

Again, why would you not just use Wikipedia as your index? What I'm saying is: why would you use an index that lies to you and hallucinates instead of another perfectly good index elsewhere?

You're using a machine that ingests and regurgitates sources like Wikipedia back to you. Why not skip the middleman entirely?

rockemsockem 7 days ago | parent [-]

Because the middleman is faster and practically never lies/hallucinates for simple queries, and the middleman can handle vague queries that Google and Wikipedia cannot.

The same reasons you use Wikipedia instead of reading all the citations on Wikipedia.

fzeroracer 7 days ago | parent [-]

> Because the middleman is faster and practically never lies/hallucinates for simple queries

How do you KNOW it doesn't lie/hallucinate? In order to know that, you have to verify what it says. And in order to verify what it says, you need to check outside sources, like Wikipedia. So what I'm saying is: why bother wasting time with the middleman? 'Vague queries' can be distilled into simple keyword searches: if I want to know what a 'tsunami' is, I can simply plug that keyword into a Wikipedia search and skim the page or Ctrl-F for the information I want instantly.

If you assume that it doesn't lie/hallucinate because it was right on previous requests, then you fall into the exact trap that eventually blows your foot off, because sometimes it can and will hallucinate about even benign things.

rockemsockem 6 days ago | parent [-]

I feel like you're coming from a very strange place: you use advanced technology that saves you time and expands your personal knowledge base, while at the same time saying that a more advanced technology that saves you even more time and expands your knowledge base further is useless and a time sink.

For most questions, it is much faster to validate a correct answer than to figure out the answer from scratch. Vague queries CANNOT be distilled into simple keyword searches when you don't know where to start without a significant time investment. Ctrl-F relies on you and the article having the exact same preferred vocabulary for the exact same concepts.

I do not assume that LLMs don't lie or hallucinate; I start with the assumption that they will be wrong. Which, for the record, is the same assumption I take with both websites and human beings.