| ▲ | pron 3 days ago |
| The problem is that too many people just don't know how to weigh different probabilities of correctness against each other. The NYT is wrong 5% of the time - I'll believe this random person I just saw on TikTok because I've never heard of them ever being wrong; I've heard many stories about doctors being wrong - I'll listen to RFK; scientific models could be wrong, so I'll bet on climate change not being real; etc. |
|
| ▲ | ajaisjsbbz 3 days ago | parent | next [-] |
| Trust is much more nuanced than N% wrong. You have to consider circumstantial factors as well, i.e., who runs The NY Times, who gives them money, what was the reason they were wrong, and, even if they're not wrong, what information they are leaving out. The list goes on. No single metric can capture this effectively. Moreover, the more political a topic, the more likely the author is trying to influence your thoughts (but not me, I promise!). I forget who, but a historian who was asked why they wouldn't cover Civil War history responded with something to the effect of "there's no way to do serious work there because it's too political right now". It's also why things like calling your opponents dumb, etc. is so harmful. Nobody can fully evaluate the truthfulness of your claims (due to time, intellect, etc.), but if you signal "I don't like you", they're rightfully going to ignore you, because you're signaling you're unlikely to be trustworthy. Trust is hard earned and easily lost. |
| |
▲ | pron 3 days ago | parent [-]
> You have to consider circumstantial factors as well

This, too, goes into the probability of something being right or wrong. But the problem I'm pointing out is an inconsistent epistemology. The same kind of test should be applied to any claim, and then the claims have to be compared. When people trust a random TikToker over the NYT, they're not applying the same test to both sides.

> It's also why things like calling your opponents dumb, etc. is so harmful.

People who don't try to have any remotely consistent mechanism for weighing the likelihood of one claim against a contradicting one are, by my definition, stupid. Whether it's helpful or harmful to call them stupid is a whole other question.
▲ | zahlman 3 days ago | parent [-]
My experience has been that people who trust some form of alternative news over the NYT are not preferring "some random TikToker". And a lot of the time, that trust is specific to a topic, one which matters to them personally. If they cannot directly verify claims, they can at least observe ways in which their source resonates with personal experience.
▲ | pron 3 days ago | parent [-]
Yes, but their choice of whom to trust is wildly inconsistent. There is no consistent test by which they judge some claim more or less trustworthy than an opposite claim. Of course, none of us is fully consistent, but some people are extremely inconsistent. Call me naive, but I think education can help.
▲ | zahlman 3 days ago | parent [-]
> There is no consistent test by which they judge some claim more or less trustworthy than an opposite claim.

From my experience, there absolutely is. It just isn't legible to you.
▲ | pron 3 days ago | parent [-]
Ok, so what is a consistent epistemology that would lead someone (probably an American) to believe the following things: planes are safe, atoms and viruses are real, the world is a globe, vaccines cause autism, Tylenol causes autism, vitamins are helpful, mobile phones do not cause cancer, weather forecasts are usually more-or-less right, man-made climate change is not real, the government can control the weather, GPS is reliable, stimulus causes inflation but tariffs do not, immigration harms my personal economic opportunities but natural population growth does not, the Roman Empire was real but descriptions of its ethnic makeup or the reasons for its collapse are not, etc.? (The content of each individual belief is less important than the composite whole, where the scholarship of strangers is sometimes accepted and sometimes rejected in a way that isn't explained by, say, reputation or replication.)

The only thing I can come up with is that they do believe rigorous scholarship can arrive at answers, but sometimes those who do have the "real answers" lie to us for nefarious reasons. The problem with that is that it just moves the question elsewhere: how do you decide, in a non-arbitrary way, whether what you're being told is an intentional lie? (Never mind how you explain the mechanism of lying on a massive scale.) For example, an epistemology could say that if you can think of some motivation for a lie, then it's probably a lie, except that this, too, is not applied consistently. Why would doctors lie to us more than mechanics or pilots?

Another option could be, "I believe things I'm told by people who care about me." I can understand why someone who cares about me may not want to lie to me, but what is the mechanism by which caring about someone makes you know the truth? I'm sure everyone has had the personal experience of caring about someone else and still advising them incorrectly, so this, too, quickly runs into contradictions.
▲ | zahlman 3 days ago | parent [-]
> so what is a consistent epistemology that would lead someone (probably an American) to believe the following things:

First, show me a person who believes all of them. Then, try asking that person. You are asking me to justify entire worldviews. That is far beyond the scope of a single HN post, and also blatantly off topic.
▲ | pron 3 days ago | parent [-]
I think the president of the United States believes all or nearly all of these things (or claims to). And I have asked such people such questions - for example, people who fly a lot yet believe "chemtrails" are poisoning us - but their answers always ended up with some arbitrary choice that isn't applied consistently. Pretty much, when forced to choose between claims A and B, they go by which of them they wish to be true, even if they would, in other situations, judge the process of arriving at one of the conclusions to be much stronger than the other. They're more than happy to explain to you that they trust vitamins because of modern scientific research, which they describe as fraudulent when it comes to vaccines.

Their epistemology is so flagrantly inconsistent that my only conclusion is that they're stupid. I'm not saying that's an innate character trait, and I think this could well be the result of a poor education.
|
| ▲ | stocksinsmocks 3 days ago | parent | prev | next [-] |
| 5% wrong is an extremely charitable take on the NYT. I once went to a school that had complimentary subscriptions. The first time I sat down to read one, there was an article excoriating President Bush about Hurricane Katrina. The entire article was a glib expansion of the opinion of one "expert", who was just some history teacher saying it was "worse than the battle of Antietam" for America. No expertise in climate. No expertise in disaster response. No discussion of facts. "Area man says Bush sucks!" would have been just as intellectually rigorous. I put the paper back on the shelf and have never looked at one since. Don't get emotionally attached to content farms. |
| |
▲ | jfengel 3 days ago | parent | next [-]
That sounds like something from the opinion page rather than the news. That is OK, as long as it's clearly labeled. It doesn't sound particularly high quality; perhaps the author was a local giving their view from the community. Regardless, clearly labeled opinions are standard practice in journalism. They're just not on the front page. If you saw that on the front page, then I'd need more context, because that is not common practice at the NYT.
▲ | stocksinsmocks 3 days ago | parent [-]
It was on the front page, and, no, it wasn't a labeled editorial. If you feel the need to research this to defend their honor, it would have been around fall 2005. I don't assume their journalism has improved in the past 20 years, and I'm OK with not knowing.
| |
▲ | wannadingo 3 days ago | parent | prev | next [-]
So since incorporating in 1851, let's say they put out 60,000 issues. 1 issue would represent about 0.002% of their output. How do you get to over 5% wrong?
▲ | saulpw 3 days ago | parent [-]
It's a spot check. They read one article from one of those issues and spotted an error, so in their view the odds of >5% wrongness are high. (They would need a larger sample size and some statistics to make such a claim, but your numbers are certainly off as well, in the other direction.)
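A quick sketch of the statistics being gestured at here (illustrative only; it assumes article errors follow a simple binomial model, and the function below is my own): a one-article spot check barely constrains the error rate at all.

    # Exact (Clopper-Pearson) confidence interval for a binomial
    # proportion: k flawed articles observed out of n sampled.
    from scipy.stats import beta

    def clopper_pearson(k, n, alpha=0.05):
        lower = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
        upper = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
        return lower, upper

    for n in (1, 10, 100):
        lo, hi = clopper_pearson(1, n)
        print(f"1 flawed article out of {n}: 95% CI = [{lo:.2%}, {hi:.2%}]")

    # 1 out of 1   -> about [2.5%, 100%]: compatible with almost any error rate
    # 1 out of 100 -> about [0.03%, 5.4%]: only now is a "5% wrong" claim testable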
| |
▲ | code_for_monkey 3 days ago | parent | prev [-]
We're defending Bush's Katrina response now?
▲ | stocksinsmocks 3 days ago | parent [-]
No, but I'm pretty sure a Civil War history teacher's opinion isn't the source you want.
|
| ▲ | tsunamifury 3 days ago | parent | prev | next [-] |
| Once a problem demands second-order thinking, you immediately lose a significant portion of the population. It's simply reality, or else propaganda wouldn't work so well. |
| |
▲ | rglover 3 days ago | parent | next [-]
This is easily one of the most valuable comments I've ever seen on HN.
▲ | pron 3 days ago | parent | prev [-]
... and since we now know the world is more complex than we used to think, say, 1,000 years ago, this kind of "second-order thinking" is required more and more.
|
| ▲ | nsxwolf 3 days ago | parent | prev | next [-] |
| COVID ended my trust in media. I went from healthy skepticism to assuming everything is wrong/a lie. There was no accountability for this, so this will never change for me. I am like the people who lived through the Great Depression, still not trusting banks 60 years later and keeping their money under the mattress. |
| |
▲ | ZeroGravitas 3 days ago | parent | next [-]
I've seen this take a few times recently, including from a relatively famous person who seemed to be generally on my wavelength, but I don't quite understand what is meant by it. Could you quickly summarize how and why you felt let down by the media in regard to COVID?
▲ | phantasmish 3 days ago | parent | next [-]
Seconding this, I somehow managed to avoid encountering the coverage of COVID that people say shook their faith in institutions, despite following the news pretty closely. Like to the point that if not for others' reactions it'd never have occurred to me to regard the coverage as notably bad (unlike, say, the lead-up to the war in Iraq). I'd love to know what people are talking about when they bring this up, because I truly have no idea.
▲ | nsxwolf 3 days ago | parent | prev [-]
I actually kept a journal of all the things I noticed being memory-holed almost day to day, just so I wouldn't think I was going crazy. I'm not really interested in re-litigating any of it with anyone, though. Nothing good ever comes from that.
| |
▲ | pron 3 days ago | parent | prev [-]
The position of a sceptic is epistemologically valid: you distrust any claim that is under, say, 95% certainty. But this bar should be applied consistently, and sometimes you have to bet. For example, on the question of getting a vaccine or not, you must choose, and you should act on whichever claim is more likely to produce the better result. The key is that distrusting one side or source does not logically entail trusting another source more. If you think the media or the medical establishment is wrong, say, 45% of the time, you still have to find a source of information that is wrong only 40% of the time in order to prefer it.
|
| ▲ | mvdtnz 2 days ago | parent | prev [-] |
| The problem isn't "The NYT is wrong 5% of the time". It's that institutions are systematically wrong in predictable ways that happen to benefit their point of view. It's not random. It's planned. |
| |
▲ | pron 2 days ago | parent [-]
But the fact that source X is wrong - intentionally or not, in a biased way or not - does not entail that you should trust source Y. That just doesn't follow. To prefer an alternative source, you must find one that is more trustworthy. The problem is that we often have to choose, because decisions are binary: either we get a vaccine or we don't. For example, to decide not to get a vaccine, the belief that the medical establishment are lying liars is just not enough. We must also believe that the anti-vaxxers are more knowledgeable and trustworthy than the medical establishment. Doctors could be lying 60% of the time and still be more likely to be right than, say, RFK. It's not enough to look at only one side; we have to compare the two claims against each other. For the best outcome, you have to believe someone who's wrong 80% of the time over someone who's wrong 90% of the time.

Even if you believe in a systemic, non-random bias, that doesn't help you unless you have a more reliable source of information. And this is exactly the inconsistent epistemology we see all around us: people reject one source of information by some metric they devise for themselves and then accept another source that fails that metric even worse.
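A toy sketch of that comparison (illustrative code with made-up error rates, not from the comment above; it assumes each source is simply wrong with a fixed, independent probability): on a forced binary choice, only the comparison between the two error rates matters, not either source's absolute trustworthiness.

    # Forced binary decision (e.g., vaccinate or not), made by always
    # following one source. A source that is wrong 80% of the time still
    # beats one that is wrong 90% of the time, even though both are awful.
    import random

    def fraction_right(p_wrong, trials=100_000):
        # Fraction of decisions that come out right if you always follow
        # a source that is wrong with probability p_wrong.
        return sum(random.random() >= p_wrong for _ in range(trials)) / trials

    for p_wrong in (0.80, 0.90):
        print(f"source wrong {p_wrong:.0%} of the time -> "
              f"decisions right {fraction_right(p_wrong):.1%} of the time")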
|