alyxya 10 days ago
I try to use my best judgment about what to trust. It isn’t guaranteed that content written by humans is necessarily correct either. The nice thing about ChatGPT is that I can ask for sources, and sometimes I can rely on that source to fact-check.
latexr 10 days ago | parent | next
> The nice thing about ChatGPT is that I can ask for sources

And it will make them up just like it does everything else. You can’t trust those either. In fact, one of the simplest ways to tell a post is AI slop is to check the sources listed at the end and find they don’t exist. Asking for sources isn’t a magical incantation that suddenly makes things true.

> It isn’t guaranteed that content written by humans is necessarily correct either.

This is a poor argument. The overwhelming difference with humans is that you learn who you can trust about what. With LLMs, you can never reach that level.
MangoToupe 10 days ago | parent | prev
Sure, but a chatbot will compound the inaccuracy.