grey-area 10 hours ago

I really don’t think that’s doable, because why do you assume the majority output is correct? It’s just as likely to be a hallucination.

The problem is that the system has no concept of correctness or a world model.

disgruntledphd2 8 hours ago | parent

Assuming that hallucinations are relatively random, that's true. I do believe that they happen less often when you feed the model decent context, though.
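
To make that assumption concrete, here's a minimal toy simulation (everything here is hypothetical, including the sample_answer model; it's not any real API): if a single sample is correct only 40% of the time but hallucinations scatter randomly across many distinct wrong answers, majority voting over several samples concentrates on the correct one. Correlated or systematic hallucinations, as the parent comment worries, would break this.

    import random
    from collections import Counter

    def sample_answer(p_correct: float, n_wrong: int) -> str:
        """Toy model: right answer with probability p_correct,
        otherwise one of n_wrong distinct wrong answers, uniformly."""
        if random.random() < p_correct:
            return "correct"
        return f"wrong-{random.randrange(n_wrong)}"

    def majority_vote(k: int, p_correct: float, n_wrong: int) -> str:
        """Sample k answers and return the most common one."""
        votes = Counter(sample_answer(p_correct, n_wrong) for _ in range(k))
        return votes.most_common(1)[0][0]

    def accuracy(trials: int, k: int, p_correct: float, n_wrong: int) -> float:
        hits = sum(majority_vote(k, p_correct, n_wrong) == "correct"
                   for _ in range(trials))
        return hits / trials

    if __name__ == "__main__":
        random.seed(0)
        # Accuracy rises with the number of votes k only because the
        # wrong answers are independent and spread thin.
        for k in (1, 5, 15):
            print(k, accuracy(5000, k, p_correct=0.4, n_wrong=20))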