homebrewer 16 hours ago
And this explanation is very likely to be entirely hallucinated, or worse, subtly wrong in ways that aren't obvious if you're not already well versed in the subject. So if you care about the truth even a little bit, you then have to go and recheck everything it has "said". Why waste time and energy on the lying machine in the first place?

Just yesterday I asked "PhD-level intelligence" for a well-known quote from a famous person because I wasn't able to find it quickly on Wikiquote. It fabricated three different quotes in a row, none of them right. One of them was supposedly from a book that doesn't even exist. So I resorted to a Google search and found what I needed in less time than it took to fight that thing.
CamperBob2 16 hours ago | parent
> And this explanation is very likely to be entirely hallucinated, or worse, subtly wrong in ways that aren't obvious if you're not already well versed in the subject. So if you care about the truth even a little bit, you then have to go and recheck everything it has "said".

It cited its sources, which is certainly more than you've done.

> Just yesterday I asked "PhD-level intelligence" for a well-known quote from a famous person because I wasn't able to find it quickly on Wikiquote.

In my experience this means that you typed a poorly formed question into the free instant version of ChatGPT, got an answer worthy of the effort you put into it, and drew a sweeping conclusion that you will now stand by for the next 2-3 years until cognitive dissonance finally catches up with you.

But now I'm the one who's making stuff up, I guess.