Paracompact | 11 hours ago
Most of the popular discourse around AI is still at the level of "Don't trust the AI, trust the sources!" When even the sources of simple facts become untrustworthy, the average person just trying to learn some trivia about the world is doomed. It doesn't help that AI media literacy is so primitive compared to how intelligent the models generally are.

We're in a marginally better place than we were back when chatbots didn't cite anything at all, but serving up duplicate Wikipedia citations that all trace back to a single source, for a supposedly global event, is just embarrassing. By default, citations and epistemological qualifications should be explicit, front-and-center, and open to inspection, not implicit and confined to tiny opaque buttons as an afterthought.
amiga386 | 11 hours ago | parent
Wikipedia calls this https://en.wikipedia.org/wiki/Citogenesis (a term coined by xkcd). You can expect the spicy autocomplete to feed you flattering bullshit. It may cite Wikipedia (it shouldn't), but you should go check those citations and validate the claims yourself. It's the least you can do. And if the cited source is Wikipedia... check Wikipedia's sources too.

Wikipedians try their best to provide reliable sources for the claims in their articles (oh, who am I kidding? They pick their favourite sources that affirm their beliefs, contending editors remove them for no good reason, and eventually the only thing that accrues is what the factions agree on, or at least what ArbCom has demanded they stop fighting over).

I guess what I'm trying to say is: don't rely on that authoritative-sounding tone that Wikipedia uses (or that AI bots use, or that I'm using right now). It's a rhetorical trick that short-circuits your reasoning. Verify claims with care. Also check the Talk page; you'll often find all kinds of shenanigans called out there.