IshKebab 5 days ago
Nah, they have definitely reduced massively. I suspect that's just because, as models get more powerful, their answers are more likely to be true rather than hallucinations. I don't think anyone has found any new techniques to prevent them. But maybe we don't need that anyway if models get so good that they naturally don't hallucinate much.
shaky-carrousel 5 days ago
That's because they're harder to spot, not because there are fewer. In my field I still see the same amount; they're just not as egregious.
bigstrat2003 5 days ago
They haven't reduced one bit in my experience.