threeducks a day ago

> Which version of tailwind css do you know?

LLMs cannot reliably tell whether they know or don't know something. If they could, we would not have to deal with hallucinations.
redman25 11 hours ago

They can if they have been post-trained on what they know and don't know. The LLM can first be given questions to test its knowledge, and if it returns a wrong answer, that question can be added as a new training example with an "I don't know" response.
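A minimal sketch of what that post-training data step could look like, assuming a hypothetical `ask_model()` inference call and a probe set of questions with trusted reference answers (both names are placeholders for illustration, not a real API):

```python
# Hypothetical sketch: probe the model with questions whose answers are known,
# and when it answers incorrectly, emit a fine-tuning example whose target is
# "I don't know." ask_model and knowledge_probe are assumed placeholders.

def ask_model(question: str) -> str:
    """Stand-in for a call to the base model's inference endpoint."""
    return ""  # placeholder answer

knowledge_probe = [
    # {"question": ..., "reference": ...} pairs with trusted reference answers
    {"question": "Example question with a known answer?", "reference": "example answer"},
]

def build_sft_examples(probe):
    examples = []
    for item in probe:
        answer = ask_model(item["question"])
        if answer.strip().lower() == item["reference"].strip().lower():
            # The model already knows this fact: reinforce its correct answer.
            examples.append({"prompt": item["question"], "target": answer})
        else:
            # The model got it wrong: train it to abstain rather than guess.
            examples.append({"prompt": item["question"], "target": "I don't know."})
    return examples

print(build_sft_examples(knowledge_probe))
```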
nicce a day ago

We should use the correct term: we would not have to deal with bullshit.