pessimizer 2 days ago

AI constantly fabricates references. I recently (accidentally) found a question that makes it fabricate every answer. Otherwise, it tends to fabricate only about a third of them, and somehow misses dozens of others that it should easily find.

After asking for recommendations, I always immediately ask it if any are hallucinations. It then tells me a bunch of them are, then goes "Would you like more information about how LLMs "hallucinate," and for us to come up with an architecture that could reduce or eliminate the problem?" No, fake dude, I just want real books instead of imaginary ones, not to hear about your problems.

detail: The question was to find a book that examines in detail how a short-order cook's work is done, or any book with a section covering this. I started the question by mentioning that I already had Fast Foods and Short Order Cooking (1984) by Popper et al., and that it was the best I had found so far.

It gave me about a half dozen great hallucinations. You can try the question and see how it works for you. They're so dumb. Our economy is screwed.