▲ hsuduebc2 | a day ago

Yeah, it's a very powerful tool, and it needs to be used carefully and with intent. People on Hacker News mostly get that already, but for ordinary users it's a full-on paradigm shift. It moved from a very precise source of information, where the hardest part was finding the right information, to something that can produce answers on demand, where the hardest part is validating that information and knowing when to doubt the answer and force it to recheck the sources. This happened in a year or two, so I can't really blame them. The truth machine, where you didn't need to focus too much on validating answers, rapidly turned into a slop machine where, ironically, your focus matters much more.
▲ JumpCrisscross | 13 hours ago | parent | next

> People on Hacker News mostly get that already

It's super easy to stop fact-checking these AIs and just trust that they're reading the sources correctly. I caught myself doing it, went back and fact-checked past conversations, and lo and behold, in two cases shit was made up. These models are built to engage. They're going to reinforce your biases, even without evidence, because that's flattering and triggers a dopamine hit.
▲ SecretDreams | a day ago | parent | prev

> This happened in a year or two, so I can't really blame them. The truth machine, where you didn't need to focus too much on validating answers, rapidly turned into a slop machine where, ironically, your focus matters much more.

Very much this for the general public. I view it as borderline dangerous to anyone looking for confirmation bias.