▲ | pram 4 days ago
It does citations (Grok and Claude etc. do too), but I've found that when I read the source on some stuff (GitHub discussions and so on), it sometimes actually has nothing to do with what the LLM said. I've wasted a lot of time trying to find the actual spot in a threaded conversation where the example was supposedly stated.
▲ | sarchertech 4 days ago | parent [-]
Same experience with Google's search AI. The links frequently don't support the assertions; they'll just point to something that might show up in a Google search for the assertion. For example, if I ask whether a feature exists in some library, the AI says yes it does and links to a forum thread where someone asked the same question I did, but no one answered (this has happened multiple times).