AlecSchueler 2 hours ago
So we should believe the hallucinations because they sound like something that could be true? Does the LLM in the middle somehow make it more trustworthy than if GP had just shared their own pattern-matching conjecture?
Random_BSD_Geek an hour ago | parent
No. I think LLMs are garbage. Separately, and unrelated: I think Facebook is behind these bills. The LLM may be garbage and still sometimes produce a correct result.