Trust is much more nuanced than "N% wrong." You have to consider circumstantial factors as well: e.g., who runs the NY Times, who gives them money, why they were wrong, and, even when they're not wrong, what information they're leaving out. The list goes on. No single metric can capture this effectively. Moreover, the more political a topic, the more likely the author is trying to influence your thoughts (but not me, I promise!).

I forget who, but a historian was once asked why they wouldn't cover Civil War history, and responded with something to the effect of "there's no way to do serious work there because it's too political right now." It's also why things like calling your opponents dumb are so harmful. Nobody can fully evaluate the truthfulness of your claims (due to time, intellect, etc.), but if you signal "I don't like you," they're rightfully going to ignore you, because you're signaling that you're unlikely to be trustworthy. Trust is hard earned and easily lost.
> You have to consider circumstantial factors as well

This, too, goes into the probability of something being right or wrong. But the problem I'm pointing out is an inconsistent epistemology. The same kind of test should be applied to any claim, and then the results have to be compared. When people trust a random TikToker over the NYT, they're not applying the same test to both sides.

> It's also why things like calling your opponents dumb are so harmful.

People who don't try to have any remotely consistent mechanism for weighing the likelihood of one claim against a contradicting one are, by my definition, stupid. Whether it's helpful or harmful to call them stupid is a whole other question.
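To make "the same test" concrete, here's a toy sketch in Python. The track-record numbers are entirely made up; the only point is that one update rule gets applied to both sources:

    # Toy Bayesian comparison: the same update rule applied to two sources.
    # All numbers below are made-up assumptions, for illustration only.

    def posterior_true(prior, p_report_if_true, p_report_if_false):
        # Bayes' rule: P(claim is true | this source reported it)
        evidence = prior * p_report_if_true + (1 - prior) * p_report_if_false
        return prior * p_report_if_true / evidence

    prior = 0.5  # start agnostic about the claim itself

    # Hypothetical track records: how often each source reports a claim
    # when it's true vs. when it's false.
    nyt = posterior_true(prior, 0.9, 0.2)       # ~0.82
    tiktoker = posterior_true(prior, 0.6, 0.5)  # ~0.55

You can argue endlessly about the inputs, but then at least the disagreement is about the inputs, not about which rule to use, and the same inputs-plus-rule has to be defended for both sources.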
zahlman (3 days ago):

My experience has been that people who trust some form of alternative news over the NYT are not preferring "some random TikToker". And a lot of the time, that trust is specific to a topic, one which matters to them personally. If they cannot directly verify claims, they can at least observe ways in which their source resonates with personal experience.
pron (3 days ago):

Yes, but their choice of whom to trust is wildly inconsistent. There is no consistent test of how they judge some claim more or less trustworthy against an opposite claim. Of course, none of us are fully consistent, but some people are extremely inconsistent. Call me naive, but I think education can help.
zahlman (3 days ago):

> There is no consistent test of how they judge some claim more or less trustworthy against an opposite claim.

From my experience, there absolutely is. It just isn't legible to you.
pron (3 days ago):

Ok, so what is a consistent epistemology that would lead someone (probably an American) to believe the following things: planes are safe, atoms and viruses are real, the world is a globe, vaccines cause autism, Tylenol causes autism, vitamins are helpful, mobile phones do not cause cancer, weather forecasts are usually more-or-less right, man-made climate change is not real, the government can control the weather, GPS is reliable, stimulus causes inflation but tariffs do not, immigration harms my personal economic opportunities but natural population growth does not, the Roman Empire was real but descriptions of its ethnic makeup or the reasons for its collapse are not, etc. etc.?

(The content of each individual belief is less important than the composite whole, where the scholarship of strangers is sometimes accepted and sometimes rejected in a way that isn't explained by, say, reputation or replication.)

The only thing I can come up with is that they do believe rigorous scholarship can arrive at answers, but that sometimes those who do have the "real answers" lie to us for nefarious reasons. The problem is that this just moves the question elsewhere: how do you decide, in a non-arbitrary way, whether what you're being told is an intentional lie? (Never mind how you explain the mechanism of lying on a massive scale.) For example, an epistemology could say that if you can think of some motivation for a lie, then it's probably a lie, except that this, too, is not applied consistently. Why would doctors lie to us more than mechanics or pilots?

Another option could be, "I believe things I'm told by people who care about me." I can understand why someone who cares about me may not want to lie to me, but what is the mechanism by which caring about someone makes you know the truth? I'm sure everyone has had the personal experience of caring about someone else and still advising them incorrectly, so this, too, quickly runs into contradictions.
zahlman (3 days ago):

> so what is a consistent epistemology that would lead someone (probably an American) to believe the following things:

First, show me a person who believes all of them. Then, try asking that person. You are trying to ask me to justify entire worldviews. That is far beyond the scope of a single HN post, and also blatantly off topic.
pron (3 days ago):

I think the president of the United States believes all or nearly all of these things (or claims to). And I did ask such people such questions, for example, people who fly a lot yet believe "chemtrails" are poisoning us, but their answers always ended up with some arbitrary choice that isn't applied consistently.

Pretty much, when forced to choose between claims A and B, they go by which of them they wish to be true, even if they would, in other situations, judge the process of arriving at one of the conclusions to be much stronger than the other. They're more than happy to explain to you that they trust vitamins because of modern scientific research, which they describe as fraudulent when it comes to vaccines.

Their epistemology is so flagrantly inconsistent that my only conclusion was that they're stupid. I'm not saying that's an innate character trait, and I think this could well be the result of a poor education.