gf000 | 3 hours ago
> the fact that this goes viral just goes to show how rare it is

No, it shows that the failure is trivial to reproduce, and that people enjoy an easy-to-digest reminder that LLMs are not omnipotent. Your logic doesn't follow: you conclude that the failure is rare, but hallucinations and faulty logic are absolutely common failure modes of LLMs.

It's no accident that many use cases try to get the LLM to output something machine-verifiable. For example, all those "LLM solved PhD-level math problem" articles boil down to having the model write a pile of candidate proofs, machine-checking them, and only then having a human take a look. In other words, the LLM is a statistical answer generator that may produce a correct solution next to a bunch of bullshit replies, and one should be aware of that.
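A minimal sketch of that generate-then-verify loop, with Lean standing in as the machine checker (the function names and the assumption that `lean` is on PATH are mine, not from any of those articles; swap in whatever verifier fits the domain):

    import subprocess
    import tempfile
    from pathlib import Path

    def verify_lean_proof(proof_source: str) -> bool:
        """Compile a candidate proof with Lean; a non-zero exit means it fails.
        Assumes the `lean` binary is on PATH."""
        with tempfile.NamedTemporaryFile(suffix=".lean", mode="w", delete=False) as f:
            f.write(proof_source)
            path = f.name
        result = subprocess.run(["lean", path], capture_output=True)
        Path(path).unlink()
        return result.returncode == 0

    def first_verified(candidates: list[str]) -> str | None:
        """Filter-then-inspect: surface only a machine-checked output for human review."""
        for proof in candidates:
            if verify_lean_proof(proof):
                return proof
        return None

The point is the shape of the pipeline: the model generates many candidates, the checker discards the bullshit, and only what survives is worth a human's time.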