ceejayoz 2 days ago

It’ll still make shit up.

nomel 2 days ago | parent [-]

It'll make up less, so it's still worth it.

recursive 2 days ago | parent [-]

It doesn't need to make up any.

nomel 2 days ago | parent | next [-]

It does, since hallucination is a fundamental reality of the current architecture, one that most everyone in AI is working to reduce.

If you don't want hallucinations, you can't use an LLM at the moment. People are using LLMs, so giving them data so they hallucinate less is the only practical answer to the problem they have.

If you see another answer that will work within the current system of search engines using AI, please propose it.

Don't take this as me defending anything. It's the reality of the current state of the tech, and the current state of search engines, which is the context of this thread. Pretending that search engines don't use LLMs that hallucinate doesn't help anyone.

As always, we work within the playground that Google and Bing give us, because that's the reality of the web.

pixl97 2 days ago | parent | prev | next [-]

Use a database, not a neural net, if you want something that doesn't make things up.

hsbauauvhabzb 2 days ago | parent | next [-]

I didn’t choose to use a neural net; search engines, which are arguably critical and essential infrastructure, rug-pulled us.

recursive 2 days ago | parent | prev [-]

I'm on your side. Good advice for everyone.

nomel 2 days ago | parent [-]

But completely irrelevant to this thread, unrelated to the reality of search engines, and does nothing to help the grandparent.

esafak 2 days ago | parent | prev [-]

Given how LLMs work, hallucinations will still occur. If you want to minimize them, give the model the facts and tell it what (and what not) to extrapolate.
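
A minimal sketch of that kind of grounding, using the OpenAI Python client purely for illustration; the model name, the "retrieved" snippets, and the user question are all placeholders, not anything a real search engine ships:

```python
# Sketch: constrain an LLM to answer only from retrieved facts,
# rather than extrapolating freely. Everything below is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Pretend these came back from a search index or database lookup.
retrieved_facts = [
    "Snippet 1 returned by the search index for the user's query.",
    "Snippet 2 returned by the search index for the user's query.",
]

system_prompt = (
    "Answer using ONLY the facts listed below. "
    "If the facts do not contain the answer, say you don't know. "
    "Do not extrapolate beyond them.\n\n"
    + "\n".join(f"- {fact}" for fact in retrieved_facts)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "What does the documentation say about X?"},
    ],
    temperature=0,  # discourage creative extrapolation
)

print(response.choices[0].message.content)
```

This doesn't eliminate hallucination, but restricting the model to supplied facts and telling it to admit ignorance is the practical lever available today.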

SteveNuts 2 days ago | parent | next [-]

How to draw an owl:

1. Start by drawing some circles.

2. Erase everything that isn't an owl, until your drawing resembles an owl.

recursive 2 days ago | parent | prev [-]

Simpler: if you don't want them to do so, don't engage the LLM.