alpaca128 21 hours ago

There are use cases where hallucinations simply do not matter. My favorite is finding the correct term for a concept you don't know the name of. Googling is extremely bad at this, since the results will often miss the mark unless you happen to use the commonly accepted term, but an LLM can be surprisingly good at giving you a whole list of fitting names based on nothing more than a description. The same goes for movie titles and so on. If it hallucinates, you'll find out immediately, because the answer can be checked in seconds.
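As a concrete sketch of that workflow (assuming the OpenAI Python client; the model name, prompt wording, and example description are just placeholders, not a recommendation):

    # Sketch of the "name that concept" workflow: describe the thing,
    # ask for candidate terms, then verify each one with a quick search.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    description = (
        "A sorting algorithm that repeatedly walks the list, comparing "
        "adjacent elements and swapping them if they are out of order."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{
            "role": "user",
            "content": (
                "I can't remember the name of this concept: "
                f"{description} "
                "List a few candidate terms I could search for."
            ),
        }],
    )

    # A hallucinated name costs only the seconds it takes to check it.
    print(response.choices[0].message.content)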

The problem with LLMs is that they appear much smarter than they are, and people treat them as oracles instead of applying them to the problems they actually fit.

skydhash 20 hours ago

Maybe I've read too many encyclopedias, but my current workflow is to explore introductory material. Open a database textbook, for example, and you'll find all the jargon there. Curated collections can get you there too.

Books are a nice example of this: they have both a table of contents, for navigating from general to particular concepts, and an index, for keyword-based navigation.

fao_ 4 hours ago

Right! The majority of any 101 book will be enough to understand the jargon. But the above poster's comment overlooks the fact that knowing which term to use often isn't enough; you also need to know the context and usage around it. And who's to say the AI isn't bullshitting you about any or all of that? If you're still learning the material, you don't know enough to tell negatively-valued information from any other kind.