yongjik 10 hours ago

I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.

If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.

atmavatar 10 hours ago | parent | next [-]

I wonder how much of that is giving a platform to conspiracy theorists and how much of it is manipulation by the social media algorithms, making the conspiracy theories significantly more visible and persuasive.

prawn 7 hours ago | parent [-]

Is there any consideration of this with regard to Section 230? e.g., you're a passive conduit if you merely allow something to go online, but you're an active publisher if you employ any form of algorithm to publish and promote it?

mac-attack 10 hours ago | parent | prev [-]

It's poorly thought out logic. Everyone sees how messy the attempt to get to a truth backed by data + science can be, and how easily mistakes are made along the way, so they somehow conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.

Very analogous to people who don't like how inefficiently governments function and somehow conclude that the solution is to put people in power with zero experience managing government.

mitthrowaway2 9 hours ago | parent [-]

There's a journey that every hypothesis makes on the route to becoming "information", and that journey doesn't start at top-down official recognition. Ideas have to circulate, get evaluated and rejected and accepted by different groups, and eventually work their way towards consensus.

I don't believe Trump's or Kennedy's ideas about COVID and medicine are the ones that deserve to win out, but I do think that top-down suppression of ideas can be very harmful to truth seeking and was harmful during the pandemic. In North America I believe this led to a delayed (and ultimately minimal) social adoption of masks, a late acceptance of the aerosol-spread vector, an over-emphasis on hand washing, and a far-too-late restriction on international travel and mass public events, well past the point when it could have contributed to containing the disease (vs Taiwan's much more effective management, for example).

Of course there's no guarantee that those ideas would have been accepted in time to matter had there been a freer market for views, and of course it would have opened the door to more incorrect ideas as well, but I'm of the view that it would have helped.

More importantly, I think those heavy restrictions on pre-consensus ideas (many of which would later become consensus) helped lead to a broader undermining of trust in institutions, the fallout of which we are observing today.

mac-attack 8 hours ago | parent [-]

The issues you are bringing up don't show that they stuck with the wrong decision, but rather that they didn't pivot to the right decision as fast as you'd like... yet your solution is bottom-up decision-making that will undoubtedly take much, much longer to reach a consensus? How do you square that circle?

Experts can study and learn from their prior mistakes. Continually doing bottom-up when we have experts is inefficient and short-sighted, no? Surely you would streamline part of the process and end up in the pre-Trump framework yet again?

Also, I'm curious why you have such a rosy picture of the bottom-up alternatives. Are you forgetting about the ivermectin overdoses? The 17,000 deaths related to hydroxychloroquine? The US president suggesting people drink bleach? It is easy to cherry-pick the mistakes that science makes while overlooking the noise and misinformation that worms its way into less-informed/less-educated thinkers when non-experts are given the reins.

mitthrowaway2 7 hours ago | parent [-]

No, I'm not criticizing the officials for failing to reach the correct decision or adopt the correct viewpoints faster than they did. Institutions are large and risk-averse, data was incomplete, and people make mistakes.

I'm criticizing them for suppressing the dissemination of ideas that did later turn out to be correct. I hope the distinction is clear.

If you're going to impose a ban on the dissemination of ideas, you'd better be ten thousand percent sure that nothing covered by that ban later turns out to be the truth. Not a single one, not even if every other idea that got banned was correctly identified as a falsehood. Otherwise, the whole apparatus falls apart and institutions lose trust.

I'm not forgetting ivermectin overdoses. I don't believe my picture is rosy. I'm aware of all the garbage ideas out there, which is why measles is back along with all the other madness. But I'm firmly of the opinion that trying to suppress these bad ideas has only redoubled their strength in the backlash, and caused a rejection of expert knowledge altogether.