| ▲ | bossyTeacher a day ago |
The problem isn't whether they hallucinate more or less. The problem is that they hallucinate at all, and as long as they do, you have to deal with it. It doesn't really matter how you prompt; you can't prevent hallucinations, and without manual checking some will eventually slip under the radar, because the only difference between a real pattern and a hallucinated one is that one exists in the world and the other doesn't. This isn't something you can counter with more LLMs either, since the problem is intrinsic to LLMs.
| ▲ | SubiculumCode 8 hours ago | parent |
Humans also hallucinate; we have an error rate. Your argument makes little sense in absolutist terms.