YouTube says it'll bring back creators banned for Covid and election content(businessinsider.com)
308 points by delichon 11 hours ago | 577 comments
breadwinner 7 hours ago | parent | next [-]

I think it would be wise to listen to Nobel Prize-winning journalist Maria Ressa of the Philippines regarding unchecked social media.

"You and I, if we say a lie we are held responsible for it, so people can trust us. Well, Facebook made a system where the lies repeated so often that people can't tell."

"Both United Nations and Meta came to the same conclusion, which is that this platform Facebook actually enabled genocide that happened in Myanmar. Think about it as, when you say it a million times... it is not just the lie but also it is laced with fear, anger and hate. This is what was prioritized in the design and the distribution on Facebook. It keeps us scrolling, but in countries like Myanmar, in countries like Philippines, in countries where institutions are weak, you saw that online violence became real world violence."

"Fear, anger, hate, lies, salaciousness, this is the worst of human nature... and I think that's what Big Tech has been able to do through social media... the incentive structure is for the worst of who we are because you keep scrolling, and the longer you keep scrolling the more money the platform makes."

"Without a shared reality, without facts, how can you have a democracy that works?"

https://www.cnn.com/2025/01/12/us/video/gps0112-meta-scraps-...

themaninthedark 6 hours ago | parent | next [-]

"Beware of he who would deny you access to information for in his heart he dreams himself your master." - Commissioner Pravin Lal, U.N. Declaration of Rights

ethbr1 6 hours ago | parent | next [-]

Full quote: "As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he deems himself your master."

(Alpha Centauri, 1999, https://civilization.fandom.com/wiki/The_Planetary_Datalinks... )

HocusLocus 3 hours ago | parent | next [-]

"I sit here in my cubicle, here on the motherworld. When I die, they will put my body in a box and dispose of it in the cold ground. And in the million ages to come, I will never breathe, or laugh, or twitch again. So won't you run and play with me here among the teeming mass of humanity? The universe has spared us this moment."

~Anonymous, Datalinks.

01HNNWZ0MV43FF 3 hours ago | parent | prev [-]

Anyway this video about Biden drinking the blood of Christian children is brought to you by Alpha Testerone 2 Supplements, now FDA-approved

yongjik 3 hours ago | parent | prev | next [-]

That sounds great in the context of a game, but in the years since its release, we have also learned that those who style themselves as champions of free speech also dream themselves our master.

They are usually even more brazen in their ambitions than the censors, but somehow get a free pass because, hey, he's just fighting for the oppressed.

tensor 5 hours ago | parent | prev | next [-]

There is a difference between free flow of information and propaganda. Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

I think you could make a reasonable argument that the algorithms that distort social media feeds actually impede the free flow of information.

AnthonyMouse 4 hours ago | parent | next [-]

> Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

The fundamental problem here is exactly that.

We could have social media that no central entity controls, i.e. it works like the web and RSS instead of like Facebook. There are a billion feeds, every single account is a feed, but you subscribe to thousands of them at most. And then, most importantly, those feeds you subscribe to get sorted on the client.

Which means there are no ads, because nobody really wants ads, and so their user agent doesn't show them any. Ads are the source of the existing incentive for the monopolist in control of the feed to fill it with rage bait, so that incentive goes away.

The cost is that you either need a P2P system that actually works or people who want to post a normal amount of stuff to social media need to pay $5 for hosting (compare this to what people currently pay for phone service). But maybe that's worth it.
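To make the "sorting happens on the client" idea concrete, here is a minimal sketch (in Python, assuming the third-party feedparser package and purely hypothetical subscription URLs) of a user agent that merges plain RSS/Atom feeds and orders them by nothing but recency; the point is that the ranking rule is the reader's choice, not a server's:

  # Client-side feed aggregation: the ranking rule lives in the user agent,
  # not on a central server. Assumes the third-party `feedparser` package.
  import feedparser
  from datetime import datetime, timezone

  # The user's own subscription list -- hypothetical URLs, for illustration only.
  SUBSCRIPTIONS = [
      "https://example.org/alice/feed.xml",
      "https://example.net/bob/feed.xml",
  ]

  def fetch_entries(urls):
      """Pull entries from every subscribed feed."""
      entries = []
      for url in urls:
          parsed = feedparser.parse(url)
          source = parsed.feed.get("title", url)
          for entry in parsed.entries:
              published = entry.get("published_parsed")
              when = (datetime(*published[:6], tzinfo=timezone.utc)
                      if published else datetime.min.replace(tzinfo=timezone.utc))
              entries.append((when, source, entry.get("title", "")))
      return entries

  def timeline(urls, limit=50):
      """Sort purely by recency -- no engagement optimization, no ads."""
      return sorted(fetch_entries(urls), reverse=True)[:limit]

  for when, source, title in timeline(SUBSCRIPTIONS):
      print(f"{when:%Y-%m-%d %H:%M}  [{source}]  {title}")

Swapping the sort key (recency, a friend-weighted score, whatever) is then the client's prerogative; there is no slot for a platform to inject ads or rage bait into the ordering.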

nradov 5 hours ago | parent | prev | next [-]

There is no generally accepted definition of propaganda. One person's propaganda is another person's accurate information. I don't trust politicians or social media employees to make that distinction.

ruszki an hour ago | parent | next [-]

There isn't. Yet everybody knows what I mean by "propaganda against immigration" (some would discredit it, some would defend it), and nobody claims that the Hungarian government's "information campaign" about migrants is not fascist propaganda (except the government, obviously; not even their followers deny it). So yes, the edges are blurred, yet we can clearly identify some propaganda.

Also, accurate information (like "here are 10 videos of blacks killing whites") paired with distorted statistics (when there is twice as much white-on-black murder) is still propaganda. But these cases are difficult to identify, since they clearly affect almost the whole population, and not many people have even tried to fight them, especially because the propaganda's message is assembled by you yourself. (The example is fictional, but the direction exists; just look at Kirk's Twitter, for example. I have no idea about the exact numbers off the top of my head.)

tensor 5 hours ago | parent | prev | next [-]

What you think is propaganda is irrelevant. When you let people unnaturally amplify information by paying to have it forced into someone's feed, that is distorting the free flow of information.

Employees choose what you see every day you use most social media.

msandford 4 hours ago | parent [-]

Congrats! You are 99% of the way to understanding it. Now you just have to realize that "whoever is in charge" might or might not have your best interests at heart, government or private.

Anyone who has the power to deny you information absolutely has more power than those who can swamp out good information with bad. It's a subtle difference yes, but it's real.

tensor 4 hours ago | parent [-]

Banning algorithms and paid amplification is not denying you information. You can still decide for yourself who to follow, or actively look for information, actively listen to people. The difference is that it becomes your choice.

vintermann 4 hours ago | parent [-]

Well, this is about bringing back creators banned for (in YouTube's eyes) unwarranted beliefs stemming from distrust of political or medical authorities, and promoting such distrust. They weren't banned because of paid amplification.

I don't quite understand how the Ressa quote at the beginning of this thread justifies banning dissent for being too extreme. The algorithms are surely on YouTube's and Facebook's (and Ressa's!) side here; I'm sure they downranked distrust-promoting content as much as they dared and were able to, limited by e.g. local-language capabilities and their users' active attempts to evade automatic suppression, something everyone does these days.

refurb 4 hours ago | parent | prev | next [-]

And propaganda isn't by definition false information. Propaganda can be factual as well.

fellowniusmonk 4 hours ago | parent | prev [-]

So many people have just given up on the very idea of coherent reality? Of correspondence? Of grounding?

Why? No one actually lives like that when you watch their behavior in the real world.

It's not even post modernism, it's straight up nihilism masquerading as whatever is trendy to say online.

These people accuse everyone of bias while ignoring that their own position comes from a place of such extreme bias that it irrationally, presuppositionally rejects the possibility of true facts within their chosen, arbitrary cutouts. It's special pleading as a lifestyle.

It's very easy to observe, model, and simulate node-based computer networks that allow for coherent, well-formed data with high correspondence, and very easy to see networks destroyed by noise and data drift.

We have observed this empirically in real networks; it's pragmatic, and it's why the internet and other complex systems run. People rely on real network systems and the observed facts of how they succeed or fail, then try to undercut those hard-won truths from a place of utter ignorance. While relying on them! It's absurd ideological parasitism: they deny the value of the things they demonstrably value just by posting! Just the silliest form of performative contradiction.

I don't get it. Facts are facts. A thing can be objectively true in what for us is a linear global frame. The log is the log.

Wikipedia and federated text content should never be banned, nor logs and timelines, data, etc., but memes and other primarily emotive media are case by case; I don't see their value. I don't see the value in allowing people to present unprovable or demonstrably false data wrapped in a dogmatically, confidently true narrative.

I mean, present whatever you want, but mark it as interpretation or low confidence, versus multiple verified sources with a paper trail.

Data quality, grounding, and correspondence can be measured. It takes time, though, for validation to occur; it's far easier to ignore those traits and just generate infinite untruths and ungrounded data.

Why do people prop up infinite noise generation as if it was a virtue? As if noise and signal epistemically can't be distinguished ever? I always see these arguments online by people who don't live that way at all in any pragmatic sense. Whether it's flat earthers or any other group who rejects the possibility of grounded facts.

Interpretation is different, but so is the intentional destruction of a shared meaning space by turning every little word into a shibboleth.

People are intentionally destroying the ability to even negotiate connections to establish communication channels.

Infinite noise leads to runaway network failure and, in human systems, the inevitability of violence. I for one don't like to see people die because the system has destroyed message passing via attentional DDoS.

nradov 4 hours ago | parent [-]

Fortunately your biased opinion about what information has value is utterly worthless and will have zero impact on public policy. Idealized mathematical models of computer networks have no relevance to politics or freedom of expression in the real world.

boltzmann-brain 4 hours ago | parent | prev [-]

indeed, didn't YT ban a bunch of RT employees for undisclosed ties? I bet those will be coming back.

intended 39 minutes ago | parent | prev | next [-]

This is a fear of an earlier time.

We are not controlling people by reducing information.

We are controlling people by overwhelming them in it.

And when we think of a solution, our natural inclination to “do the opposite” smacks straight into our instinct against controlling or reducing access to information.

The closest I have come to any form of light at the end of the tunnel is Taiwan's efforts to create digital consultations for policy, and the idea that facts may not compete on short time horizons, but they surely win on longer ones.

soganess 5 hours ago | parent | prev | next [-]

Not in the original statement, but as it is referenced here, the word 'information' is doing absolutely ludicrous amounts of lifting. Hopefully it bent at the knees, because in my book it broke.

You can't call the phrase "the sky is mint chocolate chip pink with pulsating alien clouds" information.

ayntkilove 3 hours ago | parent | next [-]

You can call it data and have sufficient respect for others that they may process it into information. Too many have too little faith in others. If anything, we need to be deluged in data, and we will probably work it out ourselves eventually.

protocolture 3 hours ago | parent [-]

Facebook does its utmost to subject me to Tartarian, Flat Earth and Creationist content.

Yes, I block it routinely. No, the algo doesn't let up.

I don't need "faith" when I can see that a decent chunk of people disbelieve modern history, and aggressively disbelieve science.

More data doesnt help.

arevno 3 hours ago | parent | prev [-]

While this is true, it's also important to realize that during the great disinformation hysteria, perfectly reasonable statements like "This may have originated from a lab", "These vaccines are non-sterilizing", or "There were some anomalies of Benford's Law in this specific precinct and here's the data" were lumped into the exact same bucket as "The CCP built this virus to kill us all", "The vaccine will give you blood clots and myocarditis", or "The DNC rigged the election".

The "disinformation" bucket was overly large.

There was no nuance. No critical analysis of actual statements made. If it smelled even slightly off-script, it was branded and filed.

BrenBarn 3 hours ago | parent [-]

But it is because of the deluge that that happens. We can only process so much information. If the amount of "content" coming through is orders of magnitude larger, it makes sense to just reject everything that looks even slightly like nonsense, because there will still be more than enough left over.

themaninthedark 3 hours ago | parent [-]

So does that justify the situation with Jimmy Kimmel? After all, there was a deluge of information and a lot of unknowns about the shooter, but the word choice he used was very similar to the already debunked theory that it was celebratory gunfire from a supporter.

Of course not.

netsharc 2 hours ago | parent [-]

That sentence from Kimmel was IMO factually incorrect, and he was foolish to make the claim, but how is it offensive towards the dead, and why is it worth a suspension?

But as we know, MAGA are snowflakes and look for anything so they can pull out their Victim Card and yell around...

Cheer2171 6 hours ago | parent | prev | next [-]

Beware he who would tell you that any effort at trying to clean up the post apocalyptic wasteland that is social media is automatically tyranny, for in his heart he is a pedophile murderer fraudster, and you can call him that without proof, and when the moderators say your unfounded claim shouldn't be on the platform you just say CENSORSHIP.

probably_wrong 2 hours ago | parent | prev | next [-]

Beware of those who quote videogames and yet attribute them to "U.N. Declaration of Rights".

BrenBarn 3 hours ago | parent | prev | next [-]

The thing is that burying information in a firehose of nonsense is just another way of denying access to it. A great way to hide a sharp needle is to dump a bunch of blunt ones on top of it.

rixed 3 hours ago | parent | prev | next [-]

Is your point that any message is information?

Without truth there is no information.

totetsu 6 hours ago | parent | prev | next [-]

Raising the noise floor of disinformation to drown out information is a way of denying access to information too.

jancsika 4 hours ago | parent | prev | next [-]

That seems to be exactly her point, no?

Imagine an interface that reveals the engagement mechanism by, say, having an additional iframe. In this iframe an LLM clicks through its own set of recommendations picked to minimize negative emotions at the expense of engagement.

After a few days you're clearly going to notice the LLM spending less time than you clicking on and consuming content. At the same time, you'll also notice its choices are part of what seems to you a more pleasurable experience than you're having in your own iframe.
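As a purely illustrative sketch of the contrast being imagined here (not any real platform's recommender), suppose each candidate item carried two hypothetical model scores, predicted engagement and predicted negativity; the two "iframes" then differ only in which score they sort on:

  from dataclasses import dataclass

  @dataclass
  class Item:
      title: str
      predicted_engagement: float  # hypothetical model output, 0..1
      predicted_negativity: float  # hypothetical model output, 0..1

  def engagement_feed(items):
      """What the platform optimizes for: clicks and time spent."""
      return sorted(items, key=lambda i: i.predicted_engagement, reverse=True)

  def calmer_feed(items):
      """What the imagined parallel recommender optimizes for: least negative emotion."""
      return sorted(items, key=lambda i: i.predicted_negativity)

  candidates = [
      Item("Outrage headline", 0.9, 0.8),
      Item("Friend's hiking photos", 0.4, 0.1),
      Item("Local news roundup", 0.5, 0.3),
  ]
  print([i.title for i in engagement_feed(candidates)])
  print([i.title for i in calmer_feed(candidates)])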

Social media companies deny you the ability to inspect, understand, and remix how their recommendation algos work. They deny you the ability to remix an interface that does what I describe.

In short, your quote surely applies to social media companies, but I don't know if this is what you originally meant.

N_Lens 5 hours ago | parent | prev | next [-]

We must dissent.

idiotsecant 4 hours ago | parent | prev [-]

Sure, great. Now suppose that a very effective campaign of social destabilisation propaganda exists that poses an existential risk to your society.

What do you do?

It's easy to rely on absolutes and pithy quotes that don't solve any actual problems. What would you, specifically, with all your wisdom, do?

nradov 4 hours ago | parent [-]

Let's not waste time on idle hypotheticals and fear mongering. No propaganda campaign has ever posed an existential threat to the USA. Let us know when one arrives.

CJefferson 4 hours ago | parent | next [-]

Have you seen the US recently? Just in the last couple of days, the president is standing up and broadcasting clear medical lies about autism, while a large chunk of the media goes along with him.

nradov 3 hours ago | parent [-]

I have seen the US recently. I'm not going to attempt to defend the President but regardless of whether he is right or wrong about autism this is hardly an existential threat to the Republic. Presidents have been wrong about many things before and that is not a valid justification for censorship. In a few years we'll have another president and he or she will be wrong about a whole different set of issues.

CJefferson 3 hours ago | parent | next [-]

I hope I'm wrong, but I think America is fundamentally done, because the whole "checks and balances" system turned out to be trivial to steamroll as president, and future presidents will know that now.

By "done" I don't mean it won't continue to be the world's biggest and most important country, but I don't expect any other country to trust America more than they have to for 100 years or so.

nradov 2 hours ago | parent [-]

A lot of people thought that America was fundamentally done in 1861, and yet here we are. The recent fracturing of certain established institutional norms is a matter of some concern. But whether other countries trust us or not is of little consequence. US foreign policy has always been volatile, subject to the whims of each new administration. Our traditional allies will continue to cooperate regardless of trust (or lack thereof) because mutual interests are still broadly aligned and they have no credible alternative.

defrost an hour ago | parent [-]

> whether other countries trust us or not is of

some consequence. Not all consuming, but significant.

> Our traditional allies will continue to cooperate regardless of

whether they continue to include the US within that circle to the same degree, or indeed at all.

Trump's tariffs have been a boon for China's global trade connections; they continue to buy soybeans, but from new partners, whereas before they sourced mainly from the US.

_DeadFred_ 6 minutes ago | parent | prev [-]

They are spreading this nonsense in part to hide from the fact that they refuse to release the Epstein files, something that seems to implicate rather a lot of high-profile, high-importance officials potentially doing really bad things.

It's called flooding the zone, and it is a current Republican strategy: misinform, sow defeatism in the political opposition, default on or break all of the existing systems for handling politics, with the final outcome of manipulating the next election. And they publicized this, yet people like you claim to think it's a non-issue.

rixed 2 hours ago | parent | prev [-]

It doesn't have to be a national threat. Social media can be used by small organisations or even sufficiently motivated individuals to easily spread lies and slanders against individuals or groups, and it's close to impossible to prevent. (I've been fighting some trolls threatening a group of friends on Facebook lately, and I can attest how much the algorithm favors hate speech over reason.)

nradov 2 hours ago | parent [-]

That's a non sequitur. Your personal troubles are irrelevant when it comes to public policy, social media, and the fundamental human right of free expression. While I deplore hate speech, its existence doesn't justify censorship.

vachina 6 hours ago | parent | prev | next [-]

This is why China bans western social media.

yupyupyups 6 hours ago | parent | next [-]

Say what you will about the CCP, it's naive to let a foreign nation have this much impact on your subjects. The amount of poison and political manipulation imported from these platforms is astronomical.

scarface_74 5 hours ago | parent | next [-]

Well when the local media bends a knee and outright bribes the President (Paramount, Disney, Twitter, Facebook), why should we trust the domestic media?

nxm 4 hours ago | parent [-]

Like the Biden administration pressured social media to take down information and accounts that went against their narrative.

alphabettsy 4 hours ago | parent | next [-]

Is there a meaningful difference between pressuring and taking or threatening regulatory action? I think so.

bediger4000 4 hours ago | parent | prev | next [-]

Biden admin's bad behavior certainly allows Trump to act the same way.

If it was bad for Biden admin, it's much worse for Trump admin - he campaigned against it.

Eisenstein 4 hours ago | parent | prev [-]

Wait, are you saying that the person you are replying to is a hypocrite, or are you saying that the Biden admin set the standard for responsible government handling of media relations, or are you saying that if one administration does something bad it is ok for any other administration to do something bad, like a tit-for-tat tally system of bad things you get for free after the inauguration?

ethbr1 6 hours ago | parent | prev [-]

Instead of implementing government information control, why not invest those resources in educating and empowering one's citizenry to recognize disinformation?

BrenBarn 3 hours ago | parent | next [-]

To me this is sort of like saying, why do we need seat belts when we could just have people go to the gym so they're strong enough to push back an oncoming car? Well, you can't get that strong, and you also can't really educate people well enough to reliably deal with the full force of the information firehose. Even people who are good at doing it do so largely by relying on sources they've identified as trustworthy and thus offloading some of the work to those. I don't think there's anyone alive who could actually distinguish fact from fiction if they had to, say, view every Facebook/Twitter/Reddit/everything post separately in isolation (i.e., without relying on pre-screening of some sort).

And once you know you need pre-screening, the question becomes why not just provide it instead of making people hunt it down?

rixed 2 hours ago | parent | prev | next [-]

Instead of investing resources in education, why not let people discover by themselves the virtues of education?

Sarcasm aside, we tend to focus too much on the means and too little on the outcomes.

CJefferson 4 hours ago | parent | prev | next [-]

Because no one person can fight against a trillion-dollar industry that has decided misinformation makes the biggest profit.

How am I supposed to learn what’s going on outside my home town without trusting the media?

beepboopboop 5 hours ago | parent | prev | next [-]

That’s hundreds of millions of people in the US, of varying ages and mostly out of school already. Seems like a good thing to try but I’d imagine it doesn’t make a tangible impact for decades.

xracy 3 hours ago | parent | prev | next [-]

'An ounce of prevention is worth a pound of cure.'

It's so much easier to stop one source than it is to (checks notes) educate the entire populace?!? Gosh, did you really say that with a straight face? As if education isn't also under attack?

Broken_Hippo 3 hours ago | parent | prev | next [-]

Because it isn't that simple.

If we could just educate people and make sure they don't fall for scams, we'd do it. Same for disinformation.

But you just can't give that sort of broad education. If you aren't educated in medicine and can't personally verify qualifications of someone, you are going to be at a disadvantage when you are trying to tell if that health information is sound. And if you are a doctor, it doesn't mean you know about infrastructure or have contacts to know what is actually happening in the next state or country over.

It's the same with products, actually. I can't tell if an extension cord is up to code. The best that I can realistically do is hope the one I buy isn't a fake and meets all of the necessary safety requirements. A lot of things are like this.

Education isn't enough. You can't escape misinformation and none of us have the mental energy to always know these things. We really do have to work the other way as well.

idiotsecant 4 hours ago | parent | prev | next [-]

Because you want to use it yourself. You can't vaccinate if you rely on the disease to maintain power. You can't tell people not to be afraid of people different than themselves if your whole party platform is being afraid of people different than yourself.

erxam 5 hours ago | parent | prev [-]

Sorry, 'recognizing disinformation'? You must have meant 'indoctrination'.

(They don't necessarily exclude each other. You need both positive preemptive and negative repressive actions to keep things working. Liberty is cheap talk when you've got a war on your hands.)

nradov 5 hours ago | parent | prev [-]

China reflexively bans anything that could potentially challenge Chairman Xi's unchecked authority and control over the information flow.

stinkbeetle 6 hours ago | parent | prev | next [-]

I think it would be even wiser to start by holding to account the politicians, corporations, and government institutions regarding their unchecked lies, corruption, and fraud.

But no, yet again the blame is all piled on to the little people. Yes, it's us plebs lying on the internet who are the cause of all these problems and therefore we must be censored. For the greater good.

I have an alternative idea: let's first imprison or execute (with due process) politicians, CEOs, generals, heads of intelligence and other agencies and regulators who are found to have engaged in corrupt behavior, lied to the public, committed fraud or insider trading, fabricated evidence to support invading other countries, engaged in undeclared wars, ordered extrajudicial executions, colluded with foreign governments to hack elections, evaded taxes, etc. Then, after we try that out for a while, if it has not improved things, we could try ratcheting up the censorship of plebs. Now one might argue that taking such measures against those people would violate their rights, but that is a sacrifice I'm willing to make. Since We Are All In This Together™, they would be willing to make that sacrifice too. And really, if they have nothing to hide then they have nothing to fear.

When you get people like Zuckerberg lying to congress, it's pretty difficult to swallow the propaganda claiming that it's Joe Smith the unemployed plumber from West Virginia sharing "dangerous memes" with his 12 friends on Facebook that is one of the most pressing concerns.

vintermann 3 hours ago | parent | prev | next [-]

Exactly what are you trying to say about unbanning YouTubers here?

afavour 3 hours ago | parent [-]

That it could be dangerous to readmit people who broadcast disinformation? The connection seemed pretty clear to me.

vintermann 2 hours ago | parent [-]

I certainly guessed that was what you wanted to say. Funny how polarization makes everything predictable.

But what I just realized is that you don't explicitly say it, and certainly make no real argument for it. Ressa laments algorithmic promotion of inflammatory material, but didn't say "keep out anti-government subversives who spread dangerous misinformation" - which is good, because

1. We can all see how well the deplatforming worked - Trump is president again, and Kennedy is health secretary.

2. In the eyes of her government, she was very much such a person herself, so it would have been a pretty bizarre thing for her to say.

Ironically, your post is very much an online "go my team!" call, and a good one too (top of the thread!). We all understand what you want and most of us, it seems, agree. But you're not actually arguing for the deplatforming you want, just holding up Ressa as a symbol for it.

_dain_ 6 hours ago | parent | prev | next [-]

>unchecked social media

Passive voice. Who exactly is supposed to do the "checking" and why should we trust them?

breadwinner 6 hours ago | parent [-]

Citizens. Through lawsuits. Currently we can't because of Section 230.

nradov 5 hours ago | parent | next [-]

Nonsense. If social media users engage in fraud, slander, or libel then you can still hold them accountable through a civil lawsuit. Section 230 doesn't prevent this.

trhway 5 hours ago | parent | prev [-]

The "editorializing" may possibly be applied i think (not a lawyer) when the platform's manipulation of what a user sees is based on content. And the Youtube's banning of specific Covid and election content may be such an "editorializing", and thus Youtube may not have Section 230 protection at least in those cases.

nradov 5 hours ago | parent [-]

Have you even read Section 230? Editorializing is irrelevant.

refurb 4 hours ago | parent | prev | next [-]

The problem is not the content, the problem is people believing things blindly.

The idea that we need to protect people from “bad information” is a dark path to go down.

BrenBarn 3 hours ago | parent [-]

I don't see it so much as protecting people from bad information as protecting people from bad actors, among whom entities like Facebook are prominent. If people want to disseminate quackery they can do it like in the old days by standing on a street corner and ranting. The point is that the mechanisms of content delivery amplify the bad stuff.

refurb an hour ago | parent [-]

It’s a terrible idea and creates more problems than it solves.

You eliminate the good ideas along with the bad. You eliminate the good ideas that are "bad" simply because they upset people with power. You eliminate the good ideas that are "bad" simply because they are deemed too far outside the Overton window.

And worst of all, it requires some benevolent force to make the call between good and bad, which attracts all sorts of psychopaths hungry for power.

StanislavPetrov 6 hours ago | parent | prev | next [-]

>"You and I, if we say a lie we are held responsible for it, so people can trust us."

I don't know how it works in The Philippines, but in the USA the suggestion that media outlets are held responsible for the lies that they tell is one of the most absurd statements one could possibly make.

lfpeb8b45ez 6 hours ago | parent [-]

How about InfoWars?

StanislavPetrov an hour ago | parent | next [-]

I was referring more to established media that people consider credible, like NBC, CBS, The Guardian, The New York Times, The Wall Street Journal, The Atlantic, etc. The fact that the only person in "media" who has been severely punished for their lies is a roundly despised figure (without any credibility among established media or the ruling class) is not a ringing endorsement for the system. While the lies of Jones no doubt caused untold hardship for the families of the victims, they pale in comparison to the much more consequential lies told by major media outlets with far greater influence.

When corporate media figures tell lies that are useful to the establishment, they are promoted, not called to account.

In 2018, Luke Harding at The Guardian lied and published a story that "Manafort held secret talks with Assange in Ecuadorian embassy" (headline later amended with "sources say" after the fake story was debunked) in order to bolster the Russiagate narrative. It was proven without a shadow of a doubt that Manafort never went to the embassy or had any contact at all with Assange (who was under blanket surveillance) at any time. However, to this day this provably fake story remains on The Guardian website, without any sort of editor's note that it is false or that it was all a pack of lies!(1) No retraction was ever issued. Luke Harding remains an esteemed foreign correspondent for The Guardian.

In 2002, Jeffrey Goldberg told numerous lies in a completely false article in The New Yorker, "The Great Terror", which sought to establish a connection between the 9/11 attacks and Saddam Hussein.(2) This article was cited repeatedly during the run-up to the war as justification for the subsequent invasion and greatly helped contribute to an environment where a majority of Americans thought that Iraq was linked to Bin Laden and the 9/11 attackers. More than a million people were killed, in no small part because of his lies. And Goldberg? He was promoted to editor-in-chief of The Atlantic, perhaps the most prestigious and influential journal in the country. He remains in that position today.

There are hundreds, if not thousands, of similar examples. The idea suggested in the OP that corporate/established media is somehow more credible or held to a higher standard than independent media is simply not true. Unfortunately there are a ton of lies, falsehoods, and propaganda out there, and it is up to all of us to be necessarily skeptical no matter where we get our information and do our due diligence.

1. https://www.theguardian.com/us-news/2018/nov/27/manafort-hel...

2. https://www.newyorker.com/magazine/2002/03/25/the-great-terr...

anonymousiam 5 hours ago | parent | prev [-]

A sympathetic jury can be an enemy of justice.

I'm not an Alex Jones fan, but I don't understand how a conspiracy theory about the mass shooting could be construed as defamation against the parents of the victims. And the $1.3B judgement does seem excessive to me.

AlexandrB 5 hours ago | parent | next [-]

You should read up on some details. The defamation claim is because Alex Jones accused the parents of being actors who were part of staging a false flag. The huge judgement is partly because Alex Jones failed to comply[1][2] with basic court procedure like discovery in a timely way, so a default judgement was entered.

Despite his resources, Alex Jones completely failed to get competent legal representation and screwed himself. He then portrayed himself as the victim of an unjust legal system.

[1] https://www.npr.org/2021/11/15/1055864452/alex-jones-found-l...

> Connecticut Superior Court Judge Barbara Bellis cited the defendants' "willful noncompliance" with the discovery process as the reasoning behind the ruling. Bellis noted that defendants failed to turned over financial and analytics data that were requested multiple times by the Sandy Hook family plaintiffs.

[2] https://lawandcrime.com/high-profile/judge-rips-alex-jones-c...

> Bellis reportedly said Jones' attorneys "failure to produce critical material information that the plaintiffs needed to prove their claims" was a "callous disregard of their obligation," the Hartford Courant reported.

tbrownaw 5 hours ago | parent [-]

> The huge judgement is partly because Alex Jones failed to comply with basic court procedure like discovery in a timely way so a default judgement was entered.

Yeah. Refusing to cooperate with the court has to always be at least as bad as losing your case would have been.

protocolture 3 hours ago | parent | prev [-]

The specific conspiracy theory implied fraud and cover up on behalf of the parents. Lmao.

trhway 7 hours ago | parent | prev | next [-]

Censorship works both ways. When I tried speaking against the violence and genocide perpetrated by Russia in Ukraine, I was shut down on LinkedIn.

Even here on HN, I was almost banned when I wrote about the abduction of children by Russia https://news.ycombinator.com/item?id=33005062 - the crime for which, half a year later, the ICC issued its order against Putin.

breadwinner 7 hours ago | parent | next [-]

You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then do the investigation and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper company is standing behind the story, and citizens know the standing of the newspaper in their community, and how much credence to give to the story, based on that. Social media destroyed this process; now anyone can spread allegations at lightning speed on a massive scale without any evidence to back them up. This has to stop. We should return to the old way; it wasn't perfect, but it worked for hundreds of years. Repealing Section 230 will accomplish this.

themaninthedark 5 hours ago | parent | next [-]

I remember a story that was investigated and then published...it was spread far and wide. The current president of the US stole the election and our biggest adversary has videos of him in compromising positions. Then debunked. (Steele dossier) https://www.thenation.com/article/politics/trump-russiagate-...

I remember a story that was investigated and then published... for some reason it was blocked everywhere and we were not allowed to discuss the story or even link to the news article. It "has the hallmarks of a Russian intelligence operation." (Hunter Biden laptop) Only for it to come out later that it was true: https://www.msn.com/en-us/news/politics/fbi-spent-a-year-pre...

I would rather not outsource my thinking or my ability to get information to approved sources. I have had enough experience with Gell-Mann amnesia to realize they have little to no understanding of the situation as well. I may not be an expert in all domains, but while I am still free, at least I can do my best to learn.

scarface_74 5 hours ago | parent [-]

You seem to be forgetting that whole "the election was stolen" lie the President told, which had thousands of domestic terrorists invading the Capitol, terrorists he then pardoned?

But keep worrying about an inconsequential civilian’s laptop.

themaninthedark 4 hours ago | parent [-]

Forest for the trees.

Don't take my comment as a declaration for Trump and all he stands for.

My parent had posted: "You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then do the investigation and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper company is standing behind the story, and citizens know the standing of the newspaper in their community, and how much credence to give to the story, based on that."

Rather than call it an argument to authority, which it is very close to, I decided to highlight two cases where this authority that we are supposed to defer to was wrong.

Perhaps a better and more direct argument would be to point out that during the COVID pandemic, YouTube, Facebook, and Twitter were all banning and removing posts from people who had heterodox opinions, with those leading the charge crying "Trust the Science".

This runs contrary to what science and the scientific process are. Carl Sagan said it better than I: "One of the great commandments of science is, 'Mistrust arguments from authority.' ... Too many such arguments have proved too painfully wrong. Authorities must prove their contentions like everybody else."

Now that I have quoted a famous scientist in a post to help prove my point about how arguments from authority are invalid, I shall wait for the collapse of the universe upon itself.

nradov 5 hours ago | parent | prev | next [-]

It never worked. Newspapers in the old days frequently printed lies and fake news. They usually got away with it because no one held them accountable.

itbeho 4 hours ago | parent [-]

William Randolph Hearst and the Spanish-American war come to mind.

pkphilip 2 hours ago | parent | prev | next [-]

What happens when the press refuses to publish anything which doesn't align with their financial or political interest?

trhway 7 hours ago | parent | prev | next [-]

>At that point the newspaper company is standing behind the story

The newspaper company is a bottleneck that censors can easily tighten, as was the case in, say, the USSR. Or even the FCC today with media companies, as in the case of Kimmel.

Social media is our best tool so far against censorship. Even with all the censorship that we do have in social media, the information still finds a way due to the sheer scale of the Internet. That wasn't the case in the old days, when for example each typewriter could be identified by unique micro-details of the shapes of its characters.

>Social media destroyed this process, now anyone can spread allegations at lightning speed on a massive scale without any evidence to back it up.

Why believe anything not accompanied by evidence? The problem here is with the news consumer. We teach children not to stick their fingers into the electrical wall socket. If a child still sticks their fingers in there, are you going to hold the electric utility company responsible?

>This has to stop. We should return to the old way, it wasn't perfect, but it worked for 100s of years.

The same can be said about the modern high density of human population, transport connections, and the spread of infectious disease. What you suggest is to decrease the population and confine the rest, preventing any travel like in the "old days" (interesting that it took the Black Death some years to spread instead of the days it would take today, yet it still did spread around all the known world). We just saw how that works in our times (and if you say it worked back then, why aren't we still doing it today?). You can't put the genie back into the bottle and stop progress.

>Repealing Section 230 will accomplish this.

Yes, good thing people didn't decide back then to charge the actual print houses with the lies present in the newspapers they printed.

petermcneeley 6 hours ago | parent | prev | next [-]

> We should return to the old way, it wasn't perfect, but it worked for 100s of years

At this stage you are clearly just trolling. Are you even aware of the last 100s of years? From Luther to Marx? You are not acting in good faith. I want nothing to do with your ahistorical worldview.

mensetmanusman 7 hours ago | parent | prev [-]

There is no way to go back to this. It’s about as feasible as getting rid of vehicles.

breadwinner 7 hours ago | parent [-]

I am not saying we should go back to physical newspapers printed on paper. News can be published online... but whoever is publishing it has to stand behind it, and be prepared to face lawsuits from citizens harmed by false stories. This is feasible, and it is the only solution to the current mess.

nradov 4 hours ago | parent | next [-]

It's horrifying that anyone would believe that censorship and control over news would be a solution to anything. The naivety of your comment is in itself an indictment of our collective failure to properly educate the polity in civics.

knome 4 hours ago | parent | prev [-]

A determined instigator could easily continue pushing modern yellow journalism with little problem under the system you propose.

They simply need to choose which negative stories they print, which opinions they run. How do you frame misrepresentation vs. a differing point of view? How do you call out mere emphasis in which true stories are run? Truths are still truths, right?

It's not infrequent today to see political opinions washed through language to provide plausible deniability for those using it.

Hell, it's not infrequent to see racism, bigotry and hate wrapped up to avoid the key phrases of yesteryear, instead smuggling their foulness through carefully considered phrases, used specifically to shield those repeating them from being called out.

'No no no. Of course it doesn't mean _that_, you're imagining things and making false accusations.'

King-Aaron 7 hours ago | parent | prev [-]

I can think of another hot-potato country that will get posts nerfed from HN and many others

gchamonlive 7 hours ago | parent | prev [-]

That's the evil genius behind the general movement in the world to discredit democratic institutions and deflate the government.

Who would hold Meta accountable for the lies it helps spread and capitalizes upon, if not the government?

So by crippling democratic institutions and dwarfing the government to the point of virtual non-existence, all in the name of preserving freedom of speech and liberalism -- and in the process subverting both concepts -- elected leaders have managed to neutralize the only check standing in the way of big corps ramping up this misinformation machine that the social networks have become.

diego_sandoval 9 hours ago | parent | prev | next [-]

At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

[1] https://www.bbc.com/news/technology-52388586

danparsonson 8 hours ago | parent | next [-]

> the WHO contradicted itself many times during the pandemic

Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

rogerrogerr 7 hours ago | parent | next [-]

As super low hanging fruit:

June 8, 2020: WHO: Data suggests it's "very rare" for coronavirus to spread through asymptomatics [0]

June 9, 2020: WHO expert backtracks after saying asymptomatic transmission 'very rare' [1]

0: https://www.axios.com/2020/06/08/who-coronavirus-asymptomati... 1: https://www.theguardian.com/world/2020/jun/09/who-expert-bac...

Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.

margalabargala 7 hours ago | parent | next [-]

The difference between a contradiction and a revision is the difference between parallel and serial.

I'm not aware that the WHO ever claimed simultaneously contradictory things.

Obviously, rapid revisions during a period of emerging data make YouTube's policy hard to enforce fairly. Do you remove things that were in line with the WHO when they were published? When they were made? Etc.

dazilcher an hour ago | parent | next [-]

> I'm not aware that the WHO ever claimed simultaneously contradictory things.

Whether they did or not is almost irrelevant: information doesn't reach humans instantaneously; it takes time to propagate through channels with varying latency, it gets amplified or muted depending on media bias, people generally have things going on in life other than staying glued to news sources, etc.

If you take a cross sample you're guaranteed to observe contradictory "parallel" information even if the source is serially consistent.

zmgsabst 7 hours ago | parent | prev | next [-]

You’re removing people who were correct before the WHO revised their position.

margalabargala 7 hours ago | parent [-]

That is the problem I discuss in my third paragraph, yes.

brailsafe 6 hours ago | parent | prev | next [-]

> The difference between a contradiction and a revision is the difference between parallel and serial.

Eh, yeah, kind of, but it seems more like the distinction between parallel and concurrent in this case. She doesn't appear to have been wrong in that instance, while at the same time the models might have indicated otherwise: an apparent contradiction, with both apparently true within the real scope of what could be said about it at that time.

naasking 7 hours ago | parent | prev | next [-]

A censorship policy that changes daily is a shitty policy. If people on June 8th criticized that official position before they reversed the next day, do you think it was right or a good idea for them to be censored?

xracy 3 hours ago | parent | next [-]

That's a nice hypothetical. Do you have any examples of people getting censored for WHO changing their stance?

Like, we're getting pretty nuanced here pretty fast, it would be nice to discuss this against an actual example of how this was enforced rather than being upset about a hypothetical situation where we have no idea how it was enforced.

margalabargala 7 hours ago | parent | prev | next [-]

> A censorship policy that changes daily is a shitty policy.

Yes.

> If people on June 8th criticized that official position before they reversed the next day, do you think it was right or a good idea for them to be censored?

Obviously not. Like I pointed out to the other commenter, if you were to read the comment of mine you replied to, I have a whole paragraph discussing that. Not sure why you're asking again.

gjsman-1000 7 hours ago | parent [-]

Screw that; and HN needs a place to frame the most incredible takes so we never forget.

margalabargala 7 hours ago | parent [-]

The person I replied to edited their comment after I replied making it look like I was saying the opposite of what I was. Is that what you were referring to?

IIAOPSW an hour ago | parent | prev [-]

There are only two ways one could have been contradicting information from the WHO that was later revised, prior to them revising it. Either:

1. They really did have some insight or insider knowledge which the WHO missed and they spoke out in contradiction of officialdom in a nuanced and coherent way that we can all judge for ourselves.

2. They in fact had no idea what they were talking about at the time, still don't, and lucked into some of it being correct later on.

I refer to Harry Frankfurt's famous essay "On Bullshit". His thesis is that bullshit is neither a lie nor the truth but something different. It's an indifference to the factuality of one's statements altogether. A bullshit statement is one that is designed to "sound right" for the context in which it is used, but is actually just "the right thing to say" to convince people and/or win something, irrespective of whether it is true or false.

A bullshit statement is more dangerous than a lie, because the truth coming to light doesn't always expose a bullshitter the way it always exposes a lie. A lie is always false in some way, but bullshit is uncorrelated with truth and can often turn out right. Indeed a bullshitter can get a lucky streak and persist a very long time before anyone notices they are just acting confident about things they don't actually know.

So in response.

It is still a good idea to censor the people in category two. Even if the hypothetical person in your example turned out to get something right that the WHO initially got wrong, they were still spreading false information in the sense that they didn't actually know the WHO was wrong at the time when they said it. They were bullshitting. Having a bunch of people spreading a message of "the opposite of what public health officials tell you" is still dangerous and bad, even if sometimes in retrospect that advice turns out good.

People in category one were few and far between and rarely if ever censored.

natch 5 hours ago | parent | prev [-]

They would not utter the word Taiwan. That's a huge red flag that they are captured and corrupt. Are you claiming this has changed?

margalabargala 4 hours ago | parent [-]

Did you reply to the wrong comment? We're discussing whether the WHO put out simultaneously contradictory information. Whether the WHO's politics matches your preferred politics for southeast Asia doesn't seem topical?

danparsonson 6 hours ago | parent | prev | next [-]

OK and if you said something that you later realised to be wrong, would you be contradicting yourself by correcting it? What should they have done in this situation? People do make mistakes, speak out of turn, say the wrong thing sometimes; I don't think we should criticise someone in that position who subsequently fixes their error. And within a couple of days in this case! That's a good thing. They screwed up and then fixed it. What am I missing here?

stinkbeetle 6 hours ago | parent | next [-]

When you're a global organization who is pushing for the censorship of any dissent or questioning of your proclamations, it's really on you not to say one thing one day then the opposite the next day, isn't it? They could have taken some care to make sure their data and analysis was sound before making these kinds of statements.

If you posted to YouTube that it is very rare for asymptomatics to spread the disease, would you be banned? What if you posted it on the 9th in the hours between checking their latest guidance and their guidance changing? What if you posted it on the 8th but failed to remove it by the 10th?

What if you disagreed with their guidance they gave on the 8th and posted something explaining your stance? Would you still get banned if your heresy went unnoticed by YouTube's censors until the 10th at which time it now aligns with WHO's new position? Banned not for spreading misinformation, but for daring to question the secular high priests?

danparsonson an hour ago | parent | next [-]

Good lord, refer to my original comment. The person I was replying to claimed the WHO contradicted themselves, I asserted that they did not. All the rest of this is your own addition.

pests 5 hours ago | parent | prev [-]

Did the WHO push for censorship or was it YouTube/Google/others?

It was a novel time and things were changing daily. Care needs to be taken yes, but it’s also weighed against clear and open communication. People were very scared. Thinking they would die. I don’t mind having up-to-date information even if it were changing daily.

stinkbeetle 4 hours ago | parent [-]

> Did the WHO push for censorship or was it YouTube/Google/others?

Quite likely the WHO, directly or by proxy through members who are also part of the bureaucracy and governments of member states.

There is no question the WHO loves censorship and takes an authoritarian approach to its "authority".

https://healthpolicy-watch.news/the-world-health-organizatio...

https://www.theguardian.com/world/2020/nov/13/who-drops-cens...

If corporations start adopting policies that censor anything contradicting the WHO, there would be a larger onus on any claim that the WHO was not involved in that censorship, in my opinion.

If it wasn't them and it was all Google's idea to censor this without any influence from governments or these organizations (which is quite laughable to think, but let's entertain the idea), the WHO still should not have responded as it did with these knee-jerk reactions, and it should have been up to Google to ensure they did not use as their "source of truth" an organization that behaved in that way.

> It was a novel time

It wasn't really that novel since there have been centuries to study pandemics and transmissible diseases of all kinds, and there have even been many others of slightly less scale happen.

> and things were changing daily.

Things always change daily. Covid was not particularly "fast moving" at the time. It's not like new data was coming in that suddenly changed things day to day. It just progressed over the course of months and years. It appeared to be wild and fast moving and ever changing mainly because of the headless-chicken response from organizations like this.

> Care needs to be taken yes, but it’s also weighed against clear and open communication. People were very scared. Thinking they would die.

People were very scared because of the fear campaign, and the imbecilic and contradictory responses from these organizations.

Not that it was nothing to be afraid of, but people should have calmly been given data and advice and that's it. Automobiles, heart attacks, and cancer kill lots of people too, and should be taken very seriously and measures taken to reduce risk but even so it would be stupid to start screaming about them and cause panic.

> I don’t mind having up-to-date information even if it were changing daily.

It's not having data that is the problem, it is jumping the gun with analysis and findings and recommendations based on that data, then having to retract it immediately and say the opposite.

Jensson 16 minutes ago | parent [-]

> If it wasn't them and it was all Google's idea to censor this without any influence from governments or these organizations

We actually have the emails the Biden administration sent to YouTube; here is a quote from one they sent:

  "we want to be sure that you have a handle on vaccine hesitancy generally and are working toward making the problem better. This is a concern that is shared at the highest (and I mean highest) levels of the White House"
That is a very clear threat. "We want to make sure you ...", and then saying this threat is done with the highest authority of the USA, so better get working on what we want.

There are hundreds of such emails detailed in this report if you want to read what they sent to the different tech companies to make them so scared that they banned anything related to Covid: https://judiciary.house.gov/media/press-releases/weaponizati...

f33d5173 6 hours ago | parent | prev [-]

Them correcting themselves isn't a bad thing. The point is that it would be absolutely retarded to require that people never disagree with the WHO. Please try and follow the thread of the conversation and not take it down these pointless tangents.

danparsonson an hour ago | parent [-]

No, the point (and my original reply) is that correcting themselves is not the same as contradicting themselves. I didn't say anything about never disagreeing with them, and it's not a tangent, I'm replying to replies to my original comment.

brookst 7 hours ago | parent | prev | next [-]

Is there a difference between an expert opinion in the midst of a pandemic and an organizational recommendation?

rogerrogerr 7 hours ago | parent [-]

Sure seemed like you'd get kicked off YouTube equally fast for questioning either one.

thinkingtoilet 7 hours ago | parent [-]

Oh stop it. There was rampant misinformation on YouTube all throughout the pandemic.

rogerrogerr 5 hours ago | parent | next [-]

Like that the novel coronavirus first seen in Wuhan may have come from the Wuhan Novel Coronavirus Lab?

Yeah, that was banished to the dark corners of Reddit until Jon Stewart said the obvious, and he was considered too big to censor.

Dylan16807 5 hours ago | parent [-]

No. People were talking about it all over.

mensetmanusman 7 hours ago | parent | prev [-]

And unfortunately much of it was spread by official institutions like the WHO.

pylotlight 6 hours ago | parent [-]

and the governments, all of whom were bought and paid for by...... big pharma. This comment was brought to you by Pfizer

1oooqooq 7 hours ago | parent | prev [-]

they also changed the symptom definitions, so ...

danparsonson 6 hours ago | parent [-]

So as researchers learned more about COVID the WHO should've just ignored any new findings and stuck to their initial guidance? This is absurd.

wdr1 4 hours ago | parent | prev [-]

> Did they?

They said it was a fact that COVID is NOT airborne. (It is.)

Not that they believed it wasn't airborne.

Not that data was early but indicated it wasn't airborne.

That it was fact.

In fact, they published fact checks on social media asserting that position. Here is one example on the official WHO Facebook page:

https://www.facebook.com/WHO/posts/3019704278074935/?locale=...

danparsonson an hour ago | parent [-]

None of that argues that they contradicted themselves. You and several others have just hijacked this thread to pile on the WHO.

Argue that they were incompetent in their handling of it, sure, whatever. That's not the comment you're replying to.

kevin_thibedeau 7 hours ago | parent | prev | next [-]

Don't forget that they ban-hammered anyone who advanced the lab leak theory because a global entity was pulling the strings at the WHO. I first heard about Wuhan in January of 2020 from multiple Chinese nationals who were talking about the leak story they were seeing in uncensored Chinese media and were adamant that the state media story was BS. As soon as it blew up in March, Western media was manipulated into playing the bigotry angle to suppress any discussion of what may have happened.

zeven7 3 hours ago | parent | next [-]

I believe having Trump as president exacerbated many, many things during that time, and this is one example. He was quick to start blaming the "Chinese"; he tried to turn it into a reason to dislike China and Chinese people, because he doesn't like China, and he's always thinking in terms of who he likes and dislikes. This made it hard to talk about the lab leak hypothesis without sounding like you were following Trump in that. If we had had a more normal president, I don't think this and other issues would have been as polarized, and taking nuanced stances would have carried less of a cost.

cmilton 7 hours ago | parent | prev | next [-]

Because that is a bold claim to make. There is no proof of a lab leak and the evidence points to the wet market as the source. There is a debate out there for 100k to prove this. Check it out.

pton_xd 5 hours ago | parent | next [-]

> Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

A novel coronavirus outbreak happens in the exact location of a lab performing gain-of-function research on coronaviruses... but yeah, suggesting a lab leak is outlandish, offensive even, and you should be censored for even mentioning it as a possibility. Got it.

This line of thinking didn't make sense then and still doesn't make sense now.

aeternum 3 hours ago | parent [-]

Yes, Jon Stewart really nailed this point, it's a great clip and worth the re-watch.

mayama 5 hours ago | parent | prev | next [-]

> There is no proof of a lab leak and evidence leads to the wet market as the source

Because the WHO worked with the CPC to bury evidence and give a clean chit to the Wuhan lab. There was some pressure building then for international teams to visit the Wuhan lab and examine data transparently. But, with the lab leak theory thoroughly banned from discussion, the WHO visited China and gave a clean chit without even visiting the Wuhan lab or having access to lab records. The only place that could prove this definitively buried all records.

themaninthedark 3 hours ago | parent | prev | next [-]

It is not as cut and dried as you think. It is also rather hard to get any evidence when you aren't allowed to visit the "scene of the crime", so to speak, and all data is being withheld.

https://www.nytimes.com/interactive/2024/06/03/opinion/covid...

Even Dr Fauci said in 2021 he was "not convinced" the virus originated naturally. That was a shift from a year earlier, when he thought it most likely Covid had spread from animals to humans.

https://www.deseret.com/coronavirus/2021/5/24/22451233/coron...

(..February 2023..) The Department of Energy, which oversees a network of 17 U.S. laboratories, concluded with “low confidence” that SARS-CoV-2 most likely arose from a laboratory incident. The Federal Bureau of Investigation said it favored the laboratory theory with “moderate” confidence. Four other agencies, along with a national intelligence panel, still judge that SARS-CoV-2 emerged from natural zoonotic spillover, while two remain undecided.

https://www.nejm.org/doi/full/10.1056/NEJMp2305081

WHO says that "While most available and accessible published scientific evidence supports hypothesis #1, zoonotic transmission from animals, possibly from bats or an intermediate host to humans, SAGO is not currently able to conclude exactly when, where and how SARS-CoV-2 first entered the human population."

However "Without information to fully assess the nature of the work on coronaviruses in Wuhan laboratories, nor information about the conditions under which this work was done, it is not possible for SAGO to assess whether the first human infection(s) may have resulted due to a research related event or breach in laboratory biosafety."

https://www.who.int/news/item/27-06-2025-who-scientific-advi...

WHO paraphrased: We have no data at all about the Wuhan laboratory, so we cannot make a conclusion on that hypothesis. Since we have data relating to natural transmission from animals, we can say that scenario was possible.

mensetmanusman 7 hours ago | parent | prev | next [-]

It’s not a bold claim. The Fauci emails showed he and others were discussing this as a reasonable possibility.

natch 5 hours ago | parent | prev | next [-]

But there is no proof of any real wet market connection, and the evidence points to the lab as a source.

potsandpans 7 hours ago | parent | prev | next [-]

The topic at hand is not whether it's a bold claim to make. The question is: should organizations that control a large portion of the world's communication channels have the ability to unilaterally define the tone and timbre of the dialogue surrounding current events?

To the people zealously downvoting all of these replies: defend yourselves. What about this is not worthy of conversation?

I'm not saying that I support the lab leak theory. The observation is that anyone who discussed the lab leak hypothesis on social media had content removed and was potentially banned. I am fundamentally against that.

If the more general argument is that sentiments which can risk people's lives by influencing the decisions they make should be censored, then let me ask you this:

Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated.

blooalien an hour ago | parent [-]

> "Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated."

On the other hand, if he were, then whoever censored him might have just as easily become the target of some other crazy, because that appears to be the world we live in now. Something's gotta change. This whole "us vs them" situation is just agitating the most extreme folks right over the edge of sanity into "Crazy Town". Wish we could get back to bein' that whole "One Nation Under God" "Great Melting Pot" "United States" they used to blather on about in grade-school back in the day, but that ship appears to have done sailed and then promptly sunk to the bottom... :(

naasking 7 hours ago | parent | prev [-]

> Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

It was not a bold claim at the time. Not only was there no evidence that it was the wet market at the time, the joint probability of a bat coronavirus outbreak occurring in a place with few bat caves but with a lab doing research on bat coronaviruses is pretty damning. Suppressing discussion of this very reasonable observation was beyond dumb.

tbrownaw 5 hours ago | parent [-]

> Suppressing discussion of this very reasonable observation was beyond dumb.

I thought it wasn't so much an error as a conflict of interest.

amanaplanacanal 7 hours ago | parent | prev | next [-]

My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

Eventually I started seeing some serious discussion about how it might have been accidentally created through gain of function research.

api 7 hours ago | parent | next [-]

I’m undecided on the issue, but… if I were trying to cover up an accidental lab leak I’d spread a story that it was a giant conspiracy to create a bio weapon. For extra eye rolls I’d throw in some classic foil hat tropes like the New World Order or the International Bankers.

If it was a lab leak, by far the most likely explanation is that someone pricked themselves or caught a whiff of something.

A friend of mine who lived in China for a while and is familiar with the hustle culture there had his own hypothesis. Some low level techs who were being given these bats and other lab animals to euthanize and incinerate were like “wait… we could get some money for these over at the wet market!”

naasking 7 hours ago | parent | prev [-]

> My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

No, that was just the straw man circulated in your echo chamber to dismiss discussion. To be clear, there were absolutely people who believed that, but the decision to elevate the nonsense over the serious discussion is how partisan echo chambers work.

gusgus01 3 hours ago | parent [-]

That was one of the main arguments by some of my coworkers and friends when COVID came up socially. I specifically remember a coworker at a FAANG saying something along the lines of "It's a bioweapon, so it's basically an act of war".

potsandpans 7 hours ago | parent | prev | next [-]

I called this out in this thread and was immediately downvoted

McGlockenshire 7 hours ago | parent | prev [-]

> because a global entity was pulling the strings at the WHO'

excuse me I'm sorry what?

hyperhopper 9 hours ago | parent | prev | next [-]

The United States also said not to buy masks and that they were ineffective during the pandemic.

Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance

anonymousiam 5 hours ago | parent | next [-]

Fauci was trying to prevent a run on masks, which he believed were needed by health care workers. So he probably justified his lie to the US public to himself because it was for the "greater good" ("the ends justify the means" is not my view, BTW).

It turns out that masks ARE largely ineffective at preventing CoViD infection. It's amazing how many studies have come up with vastly different results.

https://egc.yale.edu/research/largest-study-masks-and-covid-...

(Before you tell me that the story I cited above says the opposite, look at the effectiveness percentages they claim for each case.)

There's also this: https://x.com/RandPaul/status/1970565993169588579

dotnet00 4 hours ago | parent | next [-]

They claim a 5% reduction in spread with cloth masks and a 12% reduction with surgical masks. I think 1 less case out of every 10 or 20 is pretty acceptable?

Especially at the time when many countries were having their healthcare systems overloaded by cases.

lisbbb 4 hours ago | parent | prev [-]

I didn't want to be the one to have to say it, but neither masks nor social distancing had any scientific backing at all. It was all made up, completely made up. The saddest thing I see all the time is the poor souls STILL wearing masks in 2025 for no reason. I don't care how immunocompromised they are, the mask isn't doing anything to prevent viral infection at all. They might help against pollen. I also can't believe how many doctors and nurses at my wife's cancer clinic wear masks all the damn time even though they are not in a surgical environment. It's all been foisted upon them by the management of those clinics, and the management is completely insane, and nobody speaks up about it because it's their job if they do, so the insanity just keeps rolling on and on, and it is utterly dehumanizing and demoralizing. If a cancer patient wants to wear a mask because it affords them some tiny comfort, then fine, but that is purely psychological. I've seen it over and over and over because I've been at numerous hospitals this past year trying to help my wife survive a cancer that I think Pfizer may be to blame for.

D-Machine 4 hours ago | parent | next [-]

Basically, yes. However, if we make a distinction between respirators (e.g. N95 mask) and masks (including "surgical" masks, which don't really have a meaningfully better FFE than cloth masks), then at least respirators offer some protection to the wearer, provided they also still minimize contact. But, in keeping with this distinction, yes, masks were never seriously scientifically supported. It is incredibly disheartening to see mask mandates still in cancer wards, despite these being mandates for (objectively useless) cloth/surgical masks.

jbm 4 hours ago | parent | prev [-]

I'm sorry about your wife.

There was a scientific basis for N95 masks and similar masks. If you are talking about cloth and paper masks, I mostly agree. Even then, there were tests done using surgical masks with 3D-printed frames. I remember this as one example of people following that line of thinking.

https://www.concordia.ca/news/stories/2021/07/26/surgical-ma...

As for dehumanization, I used to live in Tokyo and spent years riding the train. I think blaming masks for dehumanization, when we have entire systems ragebaiting us on a daily basis, is like blaming the LED light for your electric bill.

Social Distancing having "no scientific backing" is very difficult to respond to. Do you mean in terms of long term reduction of spread, or as a temporary measure to prevent overwhelming the hospitals (which is what the concern was at the time)?

I do agree that it was fundamentally dishonest to block people from going to church and then telling other people it was OK to protest (because somehow these protests were "socially distanced" and outdoors). They could have applied the same logic to Church groups and helped them find places to congregate, but it was clearly a case of having sympathy for the in-group vs the out-group.

amanaplanacanal 7 hours ago | parent | prev [-]

Yeah they burned a lot of trust with that, for sure.

lisbbb 4 hours ago | parent [-]

They burned it down to the ground and beyond. And many of you on here willfully continue to trust them and argue vehemently against people who try to tell you the actual truth of the matter. RFK Jr. is a flawed human being, but he's doing some good work in unwinding some of the web of lies we live under right now.

aeternum 3 hours ago | parent | next [-]

It's good RFK is more willing to question things but he seems just as guilty when it comes to spinning webs of lies.

If we think Tylenol might cause autism, why doesn't he run or fund a nice, clean, large randomized controlled trial? Instead he spreads conjecture based on papers with extremely weak evidence.

alphabettsy 4 hours ago | parent | prev [-]

He’s just bringing different lies with new sponsors.

sterlind 8 hours ago | parent | prev [-]

it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.

misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.

IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.

YZF 4 hours ago | parent | next [-]

This just seems incredibly difficult. Even among people who are highly intelligent, educated, and consider themselves critical thinkers, there can be a huge divergence in what "truth" is on many topics. Most people have no tools to evaluate various claims, and it's not something you can just "teach kids". Not saying education can't move the needle, but the forces we're fighting need a lot more than that.

I think some accountability for platforms is an important part of this. Platforms right now have the wrong incentives, we need to fix this. It's not just about "truth" but it's also about stealing our attention and time. It's a drug and we should regulate it like the drug it is.

adiabatichottub 8 hours ago | parent | prev | next [-]

As I recall from my school days, in Social Studies class there were a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.

tbrownaw 5 hours ago | parent [-]

I'd expect questions with that label to have the sort of answers that are a pain to grade.

blooalien an hour ago | parent | prev | next [-]

> "IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons."

You just described a perfectly normal "Civics & Current Events" class in early grade-school back when / where I grew up. We were also taught how to "follow the facts back to the actual sources" and other such proper research skills. This was way back when you had to go to an actual library and look up archived newspapers on microfiche, and encyclopedias were large collections of paper books. Y'know... When dinosaurs still roamed the streets... ;)

Aurornis 3 hours ago | parent | prev | next [-]

> IMO we need to teach kids how to identify misinformation in school.

This is extremely difficult. Many of the people who thrive on disinformation are drawn to it because they are contrarian. They distrust anything from the establishment and automatically trust anything that appears anti-establishment. If you tell them not to trust certain sources that’s actually a cue to them to explore those sources more and assume they’re holding some valuable information that “they” don’t want you to know.

The dynamics of this are very strange. A cluster of younger guys I know can list from memory a dozen different times in history when medical guidance was wrong (thalidomide, etc.), but when you fact-check Joe Rogan they laugh at you because he's a comedian, so you can't expect him to be right about everything. "Do your own research" is the key phrase, which is a dog whistle meaning: find some info to discount the professionals, but then take sources like Joe Rogan and his guests at face value because they're not the establishment.

tjpnz 7 hours ago | parent | prev [-]

Some of the worst examples of viral misinformation I've encountered were image posts on social media. They'll often include a graph, a bit of text, and links to dense articles from medical journals. Most people will give up at that point and assume it's legit because the citations point to BMJ et al. You actually need to type those URLs into a browser by hand and, assuming they go anywhere, apply knowledge taught in university-level statistics.

I spent several hours on one of these only to discover the author of the post had found a subtle way to misrepresent the findings and had done things to the graph to skew it further. You cannot expect a kid (let alone most adults) to come to the same conclusion through lessons on critical thinking.

softwaredoug 10 hours ago | parent | prev | next [-]

I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

homeonthemtn 8 hours ago | parent | next [-]

You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.

If you were on the early Internet, you were self-policing with the help of admins all the time. The difference was that you had niche populations with a stake in keeping the peace and culture of a given board.

We broke those boundaries down though and now pit strangers versus strangers for clicks and views, resulting in daily stochastic terrorism.

Police the damn speech.

softwaredoug 7 hours ago | parent | next [-]

For inciting violence. Sure. Free speech isn’t absolute.

But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.

We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.

(And I believe those experts actually did about as best they could given the circumstances)

scuff3d 8 minutes ago | parent | next [-]

Try to post a meme here, see how long it stays up.

More seriously, it's just not this simple man. I know people really want it to be, but it's not.

I watched my dad get sucked down a rabbit hole of QAnon, Alex Jones, anti-vax nonsense, and God knows what other conspiracy theories. I showed him point-blank evidence that QAnon was bullshit, and he just flat out refuses to believe it. He's representative of a not insignificant part of the population. And you can say it doesn't do any damage, but those people vote, and I think we can see clearly it's done serious damage.

When bonkers ass fringe nonsense with no basis in reality gets platformed, and people end up in that echo chamber, it does significant damage to the public discourse. And a lot of it is geared specifically to funnel people in.

In more mainstream media, climate change is a perfect example. The overwhelming majority of the scientific community has known for a long time that it's an issue. There was disagreement over cause or severity, but not over whether it was a problem. The media elevated dissenting opinions and gave the impression that it was somehow an even split, that the people who disagreed about climate change were as numerous and as well informed, which they most certainly weren't, not by a long shot. And that's done irreparable damage to society.

Obviously these are very fine lines to be walked, but even throughout US history, a country where free speech is probably more valued than anywhere else on the planet, we have accepted certain limitations for the public good.

homeonthemtn 7 hours ago | parent | prev | next [-]

If I were trying to govern during a generational, world-stopping epochal event, I would also not waste time picking through the trash to hear opinions.

I would put my trust in the people I knew were trained for this and adjust from there.

I suspect many of these opinions are born from hindsight.

xboxnolifes 6 hours ago | parent | next [-]

Letting fringe theories exist on YouTube does not stop you from accessing the WHO or CDC website.

themaninthedark 3 hours ago | parent | prev | next [-]

Luckily, it is possible for you to just listen to those you trust. No need for you to go pick through other people's opinions.

I don't see how that turns into you needing to mandate what I read and whose opinions I hear.

scuff3d a few seconds ago | parent [-]

There has been a massive uptick in anti-vax rhetoric over the last decade. As a result some Americans have decided to not vaccinate, and we are seeing a resurgence in diseases that should be eradicated.

I have a three-month-old son. At the time he was being born, in my city, there was an outbreak of one of those diseases that killed more than one kid. Don't tell me this stuff doesn't have a direct impact on people.

zmgsabst 7 hours ago | parent | prev [-]

Really?

Experts have a worse track record than open debate and the COVID censorship was directed at even experts who didn’t adhere to political choices — so to my eyes, you’re saying that you’d give in to authoritarian impulses and do worse.

judahmeek 6 hours ago | parent [-]

The problem with debate is that it hinders organized action.

At some point in any emergency, organized action has to be prioritized over debate.

Maybe that is still authoritarian, but they do say to have moderation in all things!

Gud 3 hours ago | parent | next [-]

No it doesn't. It allows for correct action to be taken.

aianus 3 hours ago | parent | prev | next [-]

God forbid someone hinder some retarded organized action before enough people's lives are ruined that our majestic rulers notice and gracefully decide to stop.

zmgsabst 4 hours ago | parent | prev | next [-]

That’s not at all how you’re taught to handle emergencies.

From health emergencies to shootings to computer system crashes to pandemics, doing things without a reason to believe they'll improve the situation is dangerous. You can make things worse, and many have. And ignoring experts shouting "wait, no!" is a recipe for disaster.

When we were responding to COVID, we had plenty of time to have that debate in a candid way. We just went down an authoritarian path instead.

SV_BubbleTime 4 hours ago | parent | prev [-]

> The problem with debate is that it hinders organized action.

Ah… so… ”we must do something! Even if it’s the wrong thing”

Hot take.

epistasis 6 hours ago | parent | prev | next [-]

Really, discussion was limited? Or blatant lies were rightly excluded from discourse?

There's a big difference, and in any healthy public discourse there are severe reputational penalties for lies.

If school reopening couldn't be discussed, could you point to that?

It's very odd how as time goes on my recollection differs so much from others, and I'm not sure if it's because of actual different experiences or because of the fog of memory.

mixmastamyk 5 hours ago | parent [-]

Blatant truths were excluded as well, and that's the main problem. See replies to: https://news.ycombinator.com/item?id=45353884

epistasis 4 hours ago | parent [-]

That's a really long thread and I'm not sure where blatant truths were excluded.

McGlockenshire 7 hours ago | parent | prev [-]

The "debate" ended up doing nothing but spreading misinformation.

Society as a whole has a responsibility to not do that kind of shit. We shouldn't be encouraging the spread of lies.

jader201 5 hours ago | parent | prev | next [-]

> Police the damn speech.

What happens when the “police” disagrees with and silences what you believe is true? Or when they allow the propagation of what you believe to be lies?

Who gets to decide what’s the truth vs. lies? The “police”?

palmfacehn 4 hours ago | parent [-]

>Who gets to decide what’s the truth vs. lies? The “police”?

This keeps coming up on this site. It seems like a basic premise for a nuanced and compassionate worldview. Humility is required. Even if we assume the best intentions, the fallible nature of man places limits on what we can do.

Yet we keep seeing posters appealing to Scientism and "objective truth". I'm not sure it is possible to have a reasonable discussion where basic premises diverge. It is clear how these themes have been used in history to support some of the worst atrocities.

frollogaston 6 hours ago | parent | prev | next [-]

Depends who is doing the policing. In this case, the White House was telling Google who to ban.

aeternum 3 hours ago | parent [-]

I think it was even slightly worse. The White House was effectively delegating the decision of who to ban/police to the NIH/NIAID, an organization that was funding novel coronavirus research in Wuhan.

It's easy to see how at minimum there could be a conflict of interest.

StanislavPetrov 6 hours ago | parent | prev | next [-]

Policing speech for civility or spam is very different than policing speech for content that you disagree with. I was on the early internet, and on the vast majority of forums policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.

nostromo 7 hours ago | parent | prev | next [-]

You've missed the point entirely.

It's not about whether Google can decide what content they want on YouTube.

The issue here is that the Biden White House was pressuring private companies to remove speech that they otherwise would host.

That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.

dotnet00 4 hours ago | parent | next [-]

They claim that the Biden admin pressured them to do it, except that they had been voluntarily doing it even during Trump's initial presidency.

The current administration has been openly threatening companies over anything and everything they don't like, it isn't surprising all of the tech companies are claiming they actually support the first amendment and were forced by one of the current administration's favorite scapegoats to censor things.

homeonthemtn 7 hours ago | parent | prev [-]

If you can't be trusted to police yourself, then it's a natural result that others will do it for you.

nostromo 7 hours ago | parent [-]

Thankfully the constitution explicitly forbids that in the US.

homeonthemtn 7 hours ago | parent [-]

Evidently it does not because it happens all the time. See: Jimmy Kimmel 2 nights ago as your most recent example.

themaninthedark 7 hours ago | parent | next [-]

Huh, last I heard was that Jimmy Kimmel is back on air.

If the Trump administration had decided to follow through with their threats, ABC could have sued and won.

Lastly, Jimmy Kimmel could have (and still possibly might be able to) sue for tortious interference.

abracadaniel 6 hours ago | parent [-]

Nexstar and Sinclair are still blocking their stations from airing him, which accounts for a quarter of the US.

frollogaston 6 hours ago | parent [-]

They're private companies. If the reason they're doing this is govt pressure (FCC licenses?), that's not ok though.

SV_BubbleTime 4 hours ago | parent | prev | next [-]

Kimmel intentionally spread a lie to his audiences for political gain. Turns out that is against the terms of ABC’s FCC broadcast license.

They asked him to apologize and he refused, so they suspended him.

mensetmanusman 7 hours ago | parent | prev [-]

That was ABC, and they just put him back.

zmgsabst 7 hours ago | parent | prev [-]

That causes problems on this board too.

E.g., pointing out that BLM is Marxist in political orientation, and that its violence against Asian and Black small businesses doesn't make sense through a racial justice lens but does through a Marxist revolutionary lens, even though that was completely factual and documented, with the posts having citations.

Censorship of that topic due to feelings shut down honest discussion about the largest organized political violence the US has seen in decades.

As censorship always does.

z0r 6 hours ago | parent | next [-]

There is no mass Marxist movement in the USA. There is a left wing crippled by worse than useless identity politics.

homeonthemtn 7 hours ago | parent | prev [-]

What on earth...

andy99 10 hours ago | parent | prev | next [-]

The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online; the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism-sceptical accounts?

mapontosevenths 10 hours ago | parent | next [-]

> the government and/or a big tech company shouldn't decide what people are "allowed" to say.

That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

> What if they started banning tylenol-autism sceptical accounts?

What if it's pro-cannibalism or pedophilia content? Everyone has a line; we're all just arguing about where exactly we think that line should be.

int_19h 8 hours ago | parent | next [-]

> There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

It really depends. I remember after the Christchurch mosque shootings, there was a scramble to block the distribution of the shooter's manifesto. In some countries, the government could declare the content illegal directly, but in others, such as Australia, they didn't have pre-existing laws sufficiently wide to cover that, and so what happened in practice is that ISPs "proactively" formed a voluntary censorship cartel, acting in lockstep to block access to all copies of the manifesto, while the government was working on the new laws. If the practical end result is the same - a complete country block on some content - does it really matter whether it's dressed up as public or private censorship?

And with large tech companies like Alphabet and Meta, it is a particularly pointed question given how much the market is monopolized.

onecommentman 4 hours ago | parent [-]

I wonder, in the case of mass violence events that were used as advertisement for the (assumed) murderer’s POV, whether there should be an equivalent of a House of Lords for the exceptional situation of censoring what in any other context would be breaking news. You don’t want or need (or be able) to censor a manifesto for all time, but you would want to prevent the (assumed) murderers from gaining any momentum from their heinous acts. So a ninety day (but only 90 day) embargo on public speech from bad actors, with the teeth of governmental enforcement, sounds pretty reasonable to me. Even cleverer to salt the ether with “leaks” that would actively suppress any political momentum for the (presumed) murderers during the embargo period, but with the true light of day shining after three months.

int_19h 2 hours ago | parent [-]

It doesn't sound reasonable to me tbh. If anything, reading those manifestos is a good way to learn just how nutty those people are in the first place. At the same time, having it accessible prevents speculation about motives, which can lead to false justification for politically oppressive measures.

OTOH if the goal is to prevent copycats then I don't see the point of a 90-day embargo. People who are likely to take that kind of content seriously enough to emulate are still going to do so. Tarrant, for example, specifically referenced Anders Breivik.

MostlyStable 9 hours ago | parent | prev | next [-]

It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.

plantwallshoe 9 hours ago | parent | next [-]

Isn’t promoting/removing opinions you care about a form of speech?

If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.

If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.

If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?

AfterHIA 7 hours ago | parent | next [-]

The crux of this is a shift in context (φρόνησις) wherein entities like marketing companies or video hosting platforms are treated like moral agents which act in the same manner as individuals. We can overcome this dilemma by clarifying that generally, "individuals with the power to direct or control the speech of others run the risk of gross oppression by being more liberal with a right to control or stifle rather than erring on the side of propagating a culture of free expression, whether this power is derived from legitimate political ascension or the concentration of capital."

In short-- no. Your right is to positively assert, "Trump sign" not, "excludes all other signs as a comparative right" even though this is a practical consequence of supporting one candidate and not others. "Owning a marketing company" means that you must hold to industrial and business ethics in order to do business in a common economic space. Being the CEO of any company that serves the democratic public means that one's ethical obligations must reflect the democratic sentiment of the public. It used to be that, "capitalism" or, "economic liberalism" meant that the dollars and eyeballs would go elsewhere as a basic bottom line for the realization of the ethical sentiment of the nation-state. This becomes less likely under conditions of monopoly and autocracy. The truth is that Section 230 created a nightmare. If internet platforms are now ubiquitous and well-developed, aren't the protections realized under S230 now obsolete?

It would be neat if somebody did, "you can put any sign in my yard to promote any political cause unless it is specifically X/Trump/whatever." That would constitute a unique form of exclusionary free speech.

plantwallshoe 5 hours ago | parent [-]

> Being the CEO of any company that serves the democratic public means that one's ethical obligations must reflect the democratic sentiment of the public.

How does one determine the democratic sentiment of the public, especially a public that is pretty evenly ideologically split? Seems fraught with personal interpretation (which is arguably another form of free speech.)

AfterHIA 4 hours ago | parent [-]

Let's think pragmatically and think of, "democracy" as a way of living which seeks to maximize human felicity and minimize human cruelty. In a fair society there would be/is a consensus that at a basic level our social contract is legitimized by these commitments to that. The issue stems from splitting hairs about what human felicity constitutes. This can be resolved as recognizing that some dignified splitting of these hairs is a necessary component of that felicity. This presents in our society as the public discourse and the contingent but distinct values of communities in their efforts to realize themselves.

I'm reminded of that old line by Tolstoy-- something like, "happy families are all happy for precisely the same reasons; every unhappy family is unhappy in its own way." The point from an Adam Smith perspective is that healthy societies might all end up tending toward the same end by widely different means: Chinese communists might achieve superior cooperation and the realization of their values as, "the good life" by means dissimilar to the Quaker or the African tribesperson. The trick is seeing that the plurality of living forms and their competing values is not a hindrance to cooperation and mutual well-being but an opportunity for extended and renewed discourses about, "what we would like to be as creatures."

Worth mentioning:

https://sites.pitt.edu/~rbrandom/Courses/Antirepresentationa...

lmz 8 hours ago | parent | prev [-]

Agreed. If I have a TV network and think these anti-government hosts on my network are bad for business, that is also freedom of speech.

rubyfan 8 hours ago | parent | next [-]

Maybe. If it is independent of government coercion.

Jensson 8 hours ago | parent [-]

But Youtube did this after government coercion, so what is the difference?

alphabettsy 4 hours ago | parent [-]

I think you should look up the definition of coercion.

Jensson 44 minutes ago | parent [-]

Have you seen the emails the Biden Administration sent to YouTube? Here is a verbatim quote they sent to YouTube:

> we want to be sure that you have a handle on vaccine hesitancy generally and are working toward making the problem better. This is a concern that is shared at the highest (and I mean highest) levels of the White House

Saying you want to make sure they will censor these videos is a threat, and then they said that Biden was behind this to add legitimacy to the threat.

If it was just a friendly greeting, why would they threaten YouTube with Biden's name? If YouTube did this willingly, there would be no need to write such a threatening message saying they want to make sure YouTube censors these.

You can read the whole report here if you wanna see more: https://judiciary.house.gov/sites/evo-subsites/republicans-j...

And if you don't see that as a threat, imagine someone in the Trump administration sent it; do you still think it's not a threat? Of course it's a threat; it makes no sense to write that way otherwise. You would just say you wanted to hear how it's going, not say you want to make sure they do this specific thing and threaten them with the president's powers.

crtasm 8 hours ago | parent | prev | next [-]

I hope to see the anti-government hosts before they're let go. The channels I've tried so far only seem to have boring old anti-corruption, anti-abuse of power and anti-treating groups of people as less than human hosts.

AfterHIA 7 hours ago | parent | prev [-]

You use terms (others as well) like, "own, is the CEO of, and the owner of" and this speaks to the ironically illiberal shift we've seen in contemporary politics. Historically one needed to justify "why" some person is put into a position of authority or power-- now, as a result of the Randroid Neoliberal Assault™, it's taken for granted that if John Galt assumed a position of power, then he has a right to exercise his personal will, even at the expense of those he serves or of ethics, as an extension of "the rights of the individual."

I want to recapitulate this sentiment as often and as widely as possible-- Rand and her cronies know as much about virtue, freedom, and Aristotle as they do about fornicating; not much.

lkey 9 hours ago | parent | prev | next [-]

Or it might be the case that that 'culture' is eroding the thing it claims to be protecting. https://www.popehat.com/p/how-free-speech-culture-is-killing...

AfterHIA 7 hours ago | parent [-]

This. Even if we have concrete protections in our society, it takes a society of people committed to a common democratic cause and common functional prosperity to prevent abuses of the right to speak and so on (..) This isn't complicated, and this wasn't always controversial.

I've already described above that even in this thread there's a sentiment which is that, "as long as somebody has gained coercive power legitimately then it is within their right to coerce." I see terms thrown around like, "if somebody owns" or, "if somebody is the CEO of..." which speaks to the growing air of illiberality and liberal authoritarianism, a direct result of the neoliberal assault founding and funding thousands of Cato Institutes, Adam Smith Societies, and Heritage Foundations since the neoliberal turn in the late 1960's. We've legitimized domination ethics as an extension of the hungry rights of pseudotyrants at the expense of people in general.

I wonder what people in general might one day do about this? I wonder if there's a historical precedent for what happens when people face oppression and the degradation of common cultural projects?

https://en.wikipedia.org/wiki/Russian_Revolution#October_Rev...

https://en.wikipedia.org/wiki/Reign_of_Terror

AfterHIA 7 hours ago | parent | prev | next [-]

Bingo. This is Adam Smith's whole point in the second half of, "Wealth Of Nations" that nobody bothers to read, preferring instead the sentiments of the Cato Institute and the various Adam Smith societies. Nations produce, "kinds of people" who, based on their experience of a common liberty and prosperity, will err against tyranny. Economics and autocracy in our country are destroying our culture of, "talk and liberality." Discourse has become, "let's take turns attacking each other and each other's positions."

The American civilization has deep flaws but has historically worked toward, "doing what was right."

https://www.adamsmithworks.org/documents/book-v-of-the-reven...

SantalBlush 8 hours ago | parent | prev | next [-]

Are you in favor of HN allowing advertisements, shilling, or spam in these threads? Because those things are free speech. Would you like to allow comments about generic ED pills?

I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.

AfterHIA 6 hours ago | parent [-]

Without being crude, I think they stopped, "thinking about what that means" in any positive sense a long time ago. Cultures of discourse and criticism are never good for the powerful. The goal is to create a culture where anyone can say anything but with no meaningful social consequences, negative or positive. I can call Trump a pedophile all day on my computer interface and maybe somebody else will see it, but the Google and Meta machine just treats it as another engagement dollar. These dollars are now literally flowing to the White House in the form of investment commitments by acting Tech Czar Zuckerberg.

While I'm with my dudes in computer space-- it all starts with the passing of the Mansfield Amendment. You want to know why tech sucks and we haven't made any foundational breakthroughs for decades? The privatization of technology innovation.

https://en.wikipedia.org/wiki/Pirates_of_Silicon_Valley

https://www.nsf.gov/about/history/narrative#chapter-iv-tumul...

asadotzler 8 hours ago | parent | prev [-]

Will you criticize my book publishing company for not publishing and distributing your smut short story?

user34283 8 hours ago | parent | next [-]

No, but I will criticize Apple and Google for banning smut apps.

If those two private companies would host all legal content, this could be a thriving market.

Somehow big tech and payment processors get to censor most software.

AfterHIA 6 hours ago | parent | prev [-]

Perhaps, and if you have some kind of monopoly then definitely. Things being "yours" isn't some fundamental part of the human condition. CEOs serve their employees and shareholders and the ethics of the business space they operate in. Owners are ethically obligated to engage in fair business practices. I'm sick up to my neck of this sentiment that if John Galt is holding a gun he necessarily has the right to shoot it at somebody.

Modern democracies aren't founded on realist ethics or absolute commitments to economic liberalism as totalizing-- they're founded on an ethical balance between the real needs of people, the real potential for capital expansion, and superior sentiments about the possibilities of the human condition. As a kid who supported Ron Paul's bid for the Republican nomination as a 16-year-old, I can't help but feel that libertarian politics has ruined generations of people by getting them to accept autocracy as, "one ethical outcome of a free society." It isn't.

The irony in me posting this will be lost on most: https://www.uschamber.com/

mitthrowaway2 7 hours ago | parent | prev | next [-]

The middle ground is when a company becomes a utility. The power company can't simply disconnect your electricity because they don't feel like offering it to you, even though they own the power lines. The phone company can't disconnect your call because they disagree with what you're saying, even though they own the transmission equipment.

briHass 9 hours ago | parent | prev | next [-]

The line should be what is illegal, which, at least in the US, is fairly permissive.

The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

AfterHIA 7 hours ago | parent | prev | next [-]

There's a literal world of literature both contemporary and classical which points to the idea that concentrations of power in politics and concentrations of wealth and power in industry aren't dissimilar. I think there are limits to this as recent commentaries by guys like Zizek seem to suggest that the, "strong Nation-State" is a positive legacy of the European enlightenment. I think this is true, "when it is."

Power is power. Wealth is power. Political power is power. The powerful should not control the lives or destinies of the less powerful. This is the most basic description of contemporary democracy but becomes controversial when the Randroids and Commies alike start to split hairs about how the Lenins and John Galts of the world have a right to use power to further their respective political objectives.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm (Leviathan by Hobbes)

https://www.gutenberg.org/ebooks/50922 (Perpetual Peace by Kant)

https://www.heritage-history.com/site/hclass/secret_societie...

mc32 9 hours ago | parent | prev [-]

The thing is that people will tell you it wasn't actually censorship, because for them it was only the government being a busybody, nosey government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.

AfterHIA 6 hours ago | parent [-]

Great post mc32 (I hope you're a Wayne Kramer fan!)

This is the private-public tyranny that's going on right now. The FCC can't directly tell Kimmel, "you can't say that," but they can say, "you may have violated this or this technical rule which..." This is how Project 2025 will play out in terms of people's real experience. You occupy all posts with ideologically sympathetic players, and the liberality people are used to becomes ruinous as, "the watchers" are now, "watching for you." The irony is that most conservatives believe this is just, "what the left was doing in the 2010's, in reverse," and I don't have a counterargument for this other than, "it doesn't matter; it's always bad and unethical." Real differences between Colbert and Tate are taken for granted.

JumpCrisscross 9 hours ago | parent | prev | next [-]

> the government and/or a big tech company shouldn't decide what people are "allowed" to say

This throws out spam and fraud filters, both of which are content-based moderation.

"Nobody moderates anything" isn't, unfortunately, a functional option. Particularly if the company has to sell ads.

ncallaway 7 hours ago | parent | prev | next [-]

As with others, I think your "and/or" between government and "big tech" is problematic.

I think government censorship should be strictly prohibited. I think "company" censorship is just the application of the first amendment.

Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.

I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.

We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.

AfterHIA 8 hours ago | parent | prev | next [-]

Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."

My fear is that this is incredibly uncontroversial until it's not-- when push comes to shove we start having debates about what are, "legitimate" concentrations of power (wealth) and how that legitimacy in itself lets us, "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:

"all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."

This used to be, "social democracy" where these days the Democratic Party in the United States' motto is more, "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.

heavyset_go 8 hours ago | parent | prev | next [-]

This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

What you are arguing for is a dissolution of HN and sites like it.

asadotzler 8 hours ago | parent | prev | next [-]

No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.

As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

AfterHIA 6 hours ago | parent | next [-]

If the furry smut people became the dominant force in literature and your company was driven out of business fairly for not producing enough furry smut, would that too constitute censorship?

I want to see how steep this hill you're willing to die on is. What's that old saying-- that thing about the shoe being on the other foot?

mitthrowaway2 8 hours ago | parent | prev [-]

No, they ban your account and exclude you from the market commons if they don't like what you say.

mulmen 8 hours ago | parent [-]

Yes that’s how free markets work. Your idea has to be free to die in obscurity.

Compelled speech is not free speech. You have no right to an audience. The existence of a wide distribution platform does not grant you a right to it.

These arguments fall completely flat because it’s always about the right to distribute misinformation. It’s never about posting porn or war crimes or spam. That kind of curation isn’t contentious.

Google didn’t suddenly see the light and become free speech absolutists. They caved to political pressure and are selectively allowing the preferred misinformation of the current administration.

int_19h 8 hours ago | parent | next [-]

A market that has companies with the size - or rather, the market dominance - of the likes of Google is not meaningfully a free market. The fundamental problem isn't whether Google censors or not, nor what it censors, but the very fact that its decision on this matter is so impactful.

mulmen 6 hours ago | parent [-]

If you want to debate antitrust and regulation then let's do it. Google's dominance is bad for our society, culture, and our economy, but it's not a reason to erode our fundamental rights. Compelling speech will do nothing to erode Google's market share or encourage competition. In fact it will further entrench Google's dominance.

int_19h 2 hours ago | parent | next [-]

You're right, but freedom of speech is also a valid angle from which to debate antitrust and regulation. Indeed, I don't want Google to be compelled to platform others - I want platforms that large to not exist in the first place. But pointing out that censorship by big tech megacorps has very real and very negative effects that can be comparable to outright government censorship in some cases is a part of that fight.

mulmen an hour ago | parent [-]

> You're right, but freedom of speech is also a valid angle from which to debate antitrust and regulation.

The effect of YouTube's content moderation on speech, at YouTube's size, is a symptom of weak antitrust policy, not a free expression problem. So sure, mention the effect on speech if you want, but don't ignore the solution.

Dylan16807 5 hours ago | parent | prev [-]

How is compelling google to censor less going to entrench their dominance? If it's purely by making them suck less, I'm okay with that risk.

And I don't think it erodes any fundamental rights to put restrictions on huge monopolies.

mulmen 4 hours ago | parent [-]

> How is compelling google to censor less going to entrench their dominance?

If you force Google alone to amplify certain speech then what competitive advantage does a less censorious service provide?

> If it's purely by making them suck less, I'm okay with that risk.

Define “suck less”. Now ask yourself if you are comfortable with someone you completely disagree with defining what sucks less.

> And I don't think it erodes any fundamental rights to put restrictions on huge monopolies.

You’re talking about antitrust, not free expression.

Compelled speech is an erosion of the first amendment. You may think that erosion is acceptable but you can’t deny it exists.

Dylan16807 3 hours ago | parent [-]

> If you force Google alone to amplify certain speech then what competitive advantage does a less censorious service provide?

If that's the only "advantage" another service has, I don't care if it has no competitive advantage. If it offers anything else then that's the advantage.

Seriously this idea is super weird to me. There are plenty of reasons to avoid too much regulation. But "don't force company X to make their users happier because happy users won't leave" is a terrible reason.

>Define “suck less”. Now ask yourself if you are comfortable with someone you completely disagree with defining what sucks less.

A big part of the "if" is that people are making their own evaluations.

> You’re talking about antitrust

I am not talking about antitrust. I'm saying that the bigger and more powerful a corporation gets the further it is from a human and human rights.

> Compelled speech is an erosion of the first amendment. You may think that erosion is acceptable but you can’t deny it exists.

In this case, barely at all, and it's the same one we already have for common carriers.

mulmen 2 hours ago | parent [-]

> If that's the only "advantage" another service has, I don't care if it has no competitive advantage. If it offers anything else then that's the advantage.

The value proposition of a less censorious YouTube alternative is exactly that it is less censorious. You’re seemingly arguing against free markets.

> Seriously this idea is super weird to me. There are plenty of reasons to avoid too much regulation. But "don't force company X to make their users happier because happy users won't leave" is a terrible reason.

The problem with compelled speech is that the government should not be in the business of deciding what kind of speech makes people happy.

> A big part of the "if" is that people are making their own evaluations.

People should have the freedom to choose the media they consume. Compelled speech takes that choice away from them by putting the government in the position of making that decision for the people. This distorts the marketplace of ideas.

I don’t have time to read every comment or email or watch every video. Private content moderation is a value add and a form of expression. We need competition in that space, not government restriction.

> I am not talking about antitrust. I'm saying that the bigger and more powerful a corporation gets the further it is from a human and human rights.

If your problem with Google is how much influence they have then yes, you are talking about antitrust. That’s the regulatory mechanism by which excessive corporate influence can be restricted.

> In this case, barely at all, and it's the same one we already have for common carriers.

"A little" is still more than nothing, which was your previous assertion. You may be comfortable with the rising temperature of our shared pot of water, but I say it is a cause for concern.

themaninthedark 2 hours ago | parent | prev [-]

Just to split hairs here, as I do not think that a company should be forced to host content.

Hosting content is not giving someone an audience.

If I take my stool into the main square, stand on it, and give a speech about the evils of canned spinach, and people pass by but no one stops and listens (or not for long), I did not have an audience.

If I record the same thing and put it up on YouTube and the same reaction happens (I only get 5~10 views), YouTube is not giving me an audience. They are hosting the video, just like they do for the many other videos that are uploaded every day.

If YouTube suddenly starts pushing my video onto everyone's "Home", "Recommended" or whatever, then that would be them giving me an audience.

If the Big Spinach Canners find my video and ask Youtube to take it down, that is censorship.

mulmen an hour ago | parent [-]

> Hosting content is not giving someone an audience.

Yes, it is.

> If I take my stool into the main square and stand on it, giving a speech about the evils of canned spinach. People pass by but no-one stops and listens(or not for long), I did not have an audience.

Well, yes, you did. They are free to cheer, boo, or leave. YouTube is more like an open mic night. I reject the idea that it is a public space like a main square.

> If I record the same thing and put it up on Youtube and the same reaction happens. I only get 5~10 views, Youtube is not giving me an audience. They are hosting the video, just like they do for many other videos that are uploaded everyday.

I am lucky to have never worked in content moderation but I’m certain YouTube refuses or removes submissions every day. So while your spinach speech may survive there are many other videos that don’t.

> If Youtube suddenly starts pushing my video onto everyone's "Home", "Recommended " or whatever; then that would be them giving me an audience.

Being on YouTube at all is YouTube giving you an audience. Their recommendation algorithm is the value proposition of their product to consumers whose attention is the product sold to advertisers.

> If the Big Spinach Canners find my video and ask Youtube to take it down, that is censorship.

Perhaps in the strictest dictionary sense it is censorship but it is not censorship in a first amendment sense. This is a private business decision. You’re free to submit your video as an ad and pay Google directly for eyeballs. And they can still say no.

The only problem here is the size of YouTube relative to competitors. The fix there is antitrust, not erosion of civil liberties.

Consider the landscape that evolves in a post-YouTube environment with an eroded first amendment and without section 230 protections. Those protections are critical for innovation and free expression.

zetazzed 9 hours ago | parent | prev | next [-]

Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

AfterHIA 6 hours ago | parent [-]

Let's say that in the future the dominant form of entertainment is X-rated animal snuff films, for whatever reason. Would a lack of alternative content constitute an attack on your right to choose freely or speak? Given your ethical framework I'd have to say "no," but even as your discursive opponent I would have to admit that if you as a person are averse to "X-rated furry smut," I would sympathize with you as the oppressed if it meant your ability to live and communicate had been stifled or called into question. Oppression has many forms and many names. The Johnny Conservatarians want to reserve certain categories of cruelty as "necessary" or "permissible" by creating frameworks like "everything is permitted just as long as some social condition is met..."

At the crux of things the libertarians and the non-psychos are just having a debate on when it's fair game to be unethical or cruel to others in the name of extending human freedom and human dignity. We've fallen so far from the tree.

mulmen 8 hours ago | parent | prev [-]

I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?

AfterHIA 6 hours ago | parent | next [-]

I have a consortium of other website owners who refuse to crosslink your materials unless you put our banner on your site. Is this oppression? Oppression goes both ways, has many names, and takes many forms. Its most insidious form being the Oxford Comma.

mitthrowaway2 7 hours ago | parent | prev [-]

Is andy99's personal webpage a de-facto commons where the public congregates to share and exchange ideas?

AfterHIA 6 hours ago | parent | next [-]

I know that your post is rhetorical but I'll extend your thinking into real life-- has andy99's personal webpage been created because they're an elected official representing others? Would this still give andy99 the right to distribute hate speech on their personal webpage? I think we can harmonize around "unfortunately so," and that's why I think the way forward is concentrating on the "unfortunately" and not the "so."

We have the right to do a potentially limitless amount of unbecoming, cruel, and oppressive things to our fellow man. We also have the potential for forming and proliferating societies. We invented religion and agriculture out of dirt and need. Let us choose Nazarenes, Jeffersons, and Socrates' over Neros, Alexanders, and Napoleons. This didn't use to be politically controversial!

mulmen 4 hours ago | parent | prev [-]

It would be if they’d stop censoring me!

asadotzler 8 hours ago | parent | prev | next [-]

My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.

Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.

jhbadger 8 hours ago | parent | next [-]

It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?

amanaplanacanal 7 hours ago | parent [-]

Libraries are typically run by the government. Governments aren't supposed to censor speech. Private platforms are a different matter by law.

ultrarunner 8 hours ago | parent | prev | next [-]

At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. phpBB) in favor of platforms like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.

Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.

Scoundreller 8 hours ago | parent | next [-]

“Covid” related search results were definitely hard-coded or given a hand-tuned boost. Wikipedia was landing on the 2nd or 3rd page which never happens for a general search term on Google.

I’d even search for “coronavirus” and primarily get “official” sites about Covid-19 even tho that’s just one of many coronaviruses. At least Wikipedia makes the front page again, with the Covid-19 page outranking the coronavirus page…

sterlind 8 hours ago | parent | prev | next [-]

I'd certainly consider an ISP refusing to route my packets as silencing. Is YouTube so different? Legally, sure, but practically?

michaelt 7 hours ago | parent | next [-]

If we were still in the age of personal blogs and phpBB forums, where there were thousands of different venues, the fact that a chess forum would ban you for discussing checkers was no problem at all.

But these days, when you can count the forums on one hand even if you're missing a few fingers, and they all have extremely similar (American-style) censorship policies? To me it's less clear than it once was.

jabwd 7 hours ago | parent | prev | next [-]

Yes... because YouTube is not your ISP. A literal massive difference. Re: net neutrality.

scarface_74 5 hours ago | parent | prev [-]

No, because you are perfectly capable, technically, of setting up your own servers in a colo and distributing your video.

unyttigfjelltol 7 hours ago | parent | prev | next [-]

> My refusing to distribute your work is not "silencing."

That distinction is a relic of a world of truly public spaces used for communication— a literal town square. Then it became the malls and shopping centers, then the Internet— which runs on private pipes— and now it's technological walled gardens. Being excluded from a walled garden now is effectively being "silenced" the same way being excluded from the town square was when whatever case law you're thinking of was decided.

pfannkuchen 8 hours ago | parent | prev | next [-]

I think the feeling of silencing comes from it being a blacklist and not a whitelist.

If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.

If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.

mock-possum 7 hours ago | parent [-]

Also allowing it to be posted initially for a period of time before being taken down feels worse than simply preventing it from ever being published on your platform to begin with.

Of course they would never check things before allowing them to be posted because there isn’t any profit in that.

Jensson 8 hours ago | parent | prev | next [-]

> No one owes you distribution unless you have a contract saying otherwise.

Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.

AfterHIA 8 hours ago | parent [-]

I 100% agree with your sentiment here, Jensson, but in Googling "common carrier law" what I get are the sets of laws governing transportation services' liability:

https://en.wikipedia.org/wiki/Common_carrier

Is there perhaps another name for what you're describing? It piques my interest.

Jensson 7 hours ago | parent [-]

Common carrier also applies to phones and electricity and so on; it is what prevents your phone service provider from deciding who you can call or what you can say. Imagine a world where your phone service provider could bleep out all your swear words, or prevent you from calling certain people. That is what common carrier rules prevent.

So Google banning anyone talking about Covid is the equivalent of a phone service provider ending service for anyone mentioning Covid on their phones. Nobody but the most extreme authoritarians thinks phone providers should be allowed to do that, so why not apply this to Google as well?

amanaplanacanal 7 hours ago | parent [-]

This is essentially the free speech maximalist position: allow any legal content.

If they did that, people would leave the service in droves for a competitor with reasonable moderation. Nobody wants to use a site that is overrun with spam and porn.

nradov 4 hours ago | parent | next [-]

Perhaps. But another approach would be to give users better filtering features so that they wouldn't see content they consider objectionable, even if it's not censored and still readily available to other users.
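
For illustration, here is a minimal sketch (in Python, with purely hypothetical names; this is not any real YouTube API) of what such user-side filtering could look like: nothing is removed platform-wide, and each user's own blocklist only narrows what that user sees.

    # Hypothetical sketch of user-side filtering: the platform hosts everything,
    # and each user applies their own blocklist to the feed they see.
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        title: str
        tags: set = field(default_factory=set)

    @dataclass
    class UserFilter:
        blocked_tags: set = field(default_factory=set)
        blocked_keywords: set = field(default_factory=set)

        def allows(self, video: "Video") -> bool:
            # Hide a video only for this user if it matches their blocklist.
            if self.blocked_tags & video.tags:
                return False
            title = video.title.lower()
            return not any(kw in title for kw in self.blocked_keywords)

    def personalized_feed(videos, user_filter):
        # Content stays available to everyone else; only this user's view changes.
        return [v for v in videos if user_filter.allows(v)]

    feed = [Video("Cute cats", {"animals"}), Video("Miracle cure!", {"health"})]
    print([v.title for v in personalized_feed(feed, UserFilter(blocked_tags={"health"}))])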

Jensson 7 hours ago | parent | prev [-]

> If they did that, people would leave the service in droves for a competitor with reasonable moderation.

Did people leave Google in droves in favor of a competitor that censors all porn out of search results? No, people had no issue with the fact that you can find porn on Google; they still used it. YouTube providing porn to those who want it would not cause problems for anyone, just as it doesn't for Google search, and Google runs both, so it could easily apply the same approach to YouTube.

> Nobody wants to use a site that is overrun with spam and porn.

The internet is overrun by spam and porn yet people still use it, so you are clearly wrong. Google already runs a search engine over the internet that is capable of not showing you porn when you don't search for it, while letting you find it if you do, so Google has already solved that problem and could just do the same on YouTube.

amanaplanacanal 6 hours ago | parent [-]

Note that we are having this conversation on a site with heavy moderation. I doubt removing this moderation would in any way make the site better.

You might ask yourself why you are here, instead of another website with less or no moderation.

Jensson 33 minutes ago | parent [-]

The only reason we need moderation is that we have discussions. YouTube videos don't have that feature: you can't attach a video to another person's video, but you can attach a comment here to another person's comment. I am all for moderating YouTube comments for that very reason, but not YouTube videos.

I would prefer it if Discord / Reddit and similar became common carriers of forums, not messages. So Discord and Reddit can't control what a subreddit does and what its moderators do, but the moderators can control what the people posting there can do.

By having a common carrier forum provider, anyone could easily make their own forum with their own rules and compete on an open market without needing any technical skills, and without the forum provider being able to veto everything they say and do on that forum. That is where we want to be. In such an environment HN wouldn't need to depend on Y Combinator; you could have many independently moderated forums and pick the best one.

Discord and Reddit today aren't that; both ban things they don't like, and it would be much better if we removed that power from them. Both Reddit and Discord admins allow porn and spam, so their censorship adds zero value to the platform. The only thing it does is kick some political factions off the platform, which doesn't add any value to it, as I wouldn't visit those Discords / subreddits anyway, so they don't hurt me.

So it isn't hard to imagine how to draft such laws where all our favorite use cases are still allowed while also adding much more freedom for users and making life easier for these content platforms, since they would no longer be targeted by takedown request spam. It is a win-win for everyone except those who want to censor.

timmg 8 hours ago | parent | prev | next [-]

It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accused him of using it to swing the election or whatever.

Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.

I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."

typeofhuman 8 hours ago | parent | prev | next [-]

Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.

SCOTUS, in Bantam Books, Inc. v. Sullivan, holds that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.

This was a publicly announced motivation for Elon Musk buying Twitter, and because of that purchase we know the extent of this illegal behavior.

Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.

brookst 7 hours ago | parent [-]

Crazy how fast we got from “please remove health misinformation during a pandemic” (bad) to “FCC chair says government will revoke broadcast licenses for showing comedians mocking the president” (arguably considerably worse).

themaninthedark 6 hours ago | parent | next [-]

>On July 20, White House Communications Director Kate Bedingfield appeared on MSNBC. Host Mika Brzezinski asked Bedingfield about Biden's efforts to counter vaccine misinformation; apparently dissatisfied with Bedingfield's response that Biden would continue to "call it out," Brzezinski raised the specter of amending Section 230—the federal statute that shields tech platforms from liability—in order to punish social media companies explicitly.

>In April 2021, White House advisers met with Twitter content moderators. The moderators believed the meeting had gone well, but noted in a private Slack discussion that they had fielded "one really tough question about why Alex Berenson hasn't been kicked off from the platform."

Is there a difference between the White House stating they are looking at Section 230 and asking why this one guy has not been banned?

slater 6 hours ago | parent [-]

From your paste, it looks like Mika B. brought up the Section 230 thing?

Also, spreading disinformation about covid has real-world implications.

Orange man getting his feelings hurt because a comedian said something isn't even in the same ballpark.

themaninthedark 4 hours ago | parent | next [-]

Sorry, I only grabbed part of the quote. Here it is paraphrased, as the names are not that familiar to me.

"Shouldn't they(Facebook and Twitter) be liable for publishing that information and then open to lawsuits?" - MSNBC "Certainly, they should be held accountable, You've heard the president speak very aggressively about this. He understands this is an important piece of the ecosystem." - White House Communications Director Kate Bedingfield

Source: https://reason.com/2023/01/19/how-the-cdc-became-the-speech-...

So yes, MSNBC brought up Section 230 and the White House Communications Director says "Yes, we are looking to hold social media accountable."

>Also from the same source: The Twitter moderators believed the meeting had gone well, but noted in a private Slack discussion that they had fielded "one really tough question about why Alex Berenson hasn't been kicked off from the platform."

>Throughout 2020 and 2021, Berenson had remained in contact with Twitter executives and received assurances from them that the platform respected public debate. These conversations gave Berenson no reason to think his account was at risk. But four hours after Biden accused social media companies of killing people, Twitter suspended Berenson's account.

I don't care about Trump's feelings but if we want to be able to speak truth to power, we have to be willing to let people talk shit as well. Yes, COVID has real world implications. Almost everything does.

People on the left say "Think about the children and implications with regard to this." People on the right say "Think about the children and implications with regard to that."

Notice how none of them seem to be saying "Let's lay out the facts and let you think about it."

tbrownaw 4 hours ago | parent | prev | next [-]

Preventing people from disputing claims of fact makes it harder to find out if those claims are actually solid. Same for arguments. https://www.goodreads.com/quotes/66643-he-who-knows-only-his...

Preventing people from having a platform for content-free asshattery doesn't have that problem.

(A fun implication of this line of reasoning is that the claim that Kimmel's comments were "lies" makes the jawboning against him more morally bad rather than less bad.)

typeofhuman 6 hours ago | parent | prev [-]

> Also, spreading disinformation about covid has real-world implications.

Your logic can be used to censor anything that goes against the narratives of the arbiters of disinformation.

> Orange man getting his feelings hurt because comedian said something isn't even in the same ballpark

Pejorative. Lack of evidence. Ignoring contradictory evidence. Sounds like you are locked in.

typeofhuman 6 hours ago | parent | prev [-]

If you're referring to Jimmy Kimmel, you should probably consider that while the FCC member made that comment, Sinclair (the largest ABC affiliate group) and others had been demanding ABC cancel his show for its horrible ratings and awful rhetoric, which inhibited them from selling advertising. His show was bad for business. It's worth suspecting ABC let no good opportunity go to waste: save Kimmel's reputation and scapegoat the termination as political.

More here: https://sbgi.net/sinclair-says-kimmel-suspension-is-not-enou...

alphabettsy 3 hours ago | parent | next [-]

Personally, I'm not going to take Sinclair's press release at face value.

https://www.politico.com/story/2017/08/06/trump-fcc-sinclair...

https://upriseri.com/sinclair-nexstar-duopoly-right-wing-con...

brookst 3 hours ago | parent | prev [-]

I can’t figure out what you’re trying to say. It’s no big deal that the head of the FCC says they’ll pull licenses for media outlets that mock the president, because one media outlet says that would be the right commercial decision anyway?

That can’t be your point, but I also can’t think of a more charitable interpretation.

Ekaros 2 hours ago | parent | prev | next [-]

If you refuse to distribute some information you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violations.

To me that sounds like a fair trade. You editorialize content, you are liable for all content. In every possible way.

justinhj 8 hours ago | parent | prev | next [-]

So you're saying that YouTube is a publisher and should not have Section 230 protections? They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.

tzs 7 hours ago | parent | next [-]

Section 230 does not work like you think it does. In fact it is almost opposite of what you probably think it does. The whole point was to allow them to have it both ways.

It makes sites not count as the publisher or speaker of third party content posted to their site, even if they remove or moderate that third party content.

bee_rider 8 hours ago | parent | prev | next [-]

YouTube's business model probably wouldn't work if they were made responsible for all the content they broadcast. It would be really interesting to see a world where social media companies were treated as publishers.

Might be a boon for federated services—smaller servers, finer-grained units of responsibility…

krapp 8 hours ago | parent | prev [-]

https://www.techdirt.com/2020/06/23/hello-youve-been-referre...

justinhj 7 hours ago | parent [-]

Thank you. I was completely wrong about section 230.

joannanewsom 7 hours ago | parent | prev [-]

Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s

You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.

amanaplanacanal 7 hours ago | parent | next [-]

The US has pretty much given up on antitrust enforcement. That's the big problem.

scarface_74 5 hours ago | parent | prev [-]

The federal government was literally pressuring ABC to take Kimmel off the air. Even Ted Cruz and other prominent republicans said that was a bridge too far.

sazylusan 8 hours ago | parent | prev | next [-]

Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.

cptnapalm 8 hours ago | parent | next [-]

As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.

prisenco 7 hours ago | parent [-]

Community notes is better than nothing, but they only relate to a single tweet. So if one tweet with misinformation gets 100k likes, then a community note might show up correcting it.

But if 100 tweets each get 1000 likes, they're never singularly important enough to community note.

cptnapalm 7 hours ago | parent [-]

Fair enough on that. The problem I've seen (and don't have a good idea for how to fix) is on Reddit where the most terminally online are the worst offenders and they simply drown out everything else until non-crazy people just leave. It doesn't help that the subreddit mods are disproportionately also the terminally online.

AfterHIA 8 hours ago | parent | prev | next [-]

I feel that this is the right approach-- the liability and toxicity of the platforms isn't due to them being communication platforms; it's that in most practical or technical ways they are not. They are deliberate behavior modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.

In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.

prisenco 8 hours ago | parent [-]

Right, engagement algorithms are like giving bad takes a rocket ship.

The words of world-renowned epidemiologists who were, to be frank, boring and unentertaining could never possibly compete with crunchymom44628 yelling about how Chinese food causes covid.

Bad takes have the advantage of the engagement of both the people who vehemently agree and the people who vehemently disagree. Everyone is incentivized to be a shock jock. And the shock jocks are then molded by the algorithm to be ever more shock jockish.

Especially at a time when we were all thrown out of the streets and into our homes and online.

And here I'll end this by suggesting everyone watch Eddington.

sazylusan 8 hours ago | parent | prev | next [-]

Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

The first amendment was written in the 1700s...

hn_throwaway_99 8 hours ago | parent | prev [-]

Glad to see this, was going to make a similar comment.

People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

squigz 8 hours ago | parent [-]

Online, sure. But online doesn't mean YouTube or Facebook.

yongjik 8 hours ago | parent | prev | next [-]

I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.

If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.

atmavatar 8 hours ago | parent | next [-]

I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.

prawn 5 hours ago | parent [-]

Is there any consideration of this with regard to Section 230? e.g., you're a passive conduit if you allow something to go online, but you're an active publisher if you actively employ any form of algorithm to publish and promote?

mac-attack 7 hours ago | parent | prev [-]

It's poorly thought out logic. Everyone sees how messy the process is and how mistakes can be made when attempting to get to a truth backed by data + science, so they somehow conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.

Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.

mitthrowaway2 7 hours ago | parent [-]

There's a journey that every hypothesis makes on the route to becoming "information", and that journey doesn't start at top-down official recognition. Ideas have to circulate, get evaluated and rejected and accepted by different groups, and eventually grasp their way towards consensus.

I don't believe Trump's or Kennedy's ideas about COVID and medicine are the ones that deserve to win out, but I do think that top-down suppression of ideas can be very harmful to truth seeking and was harmful during the pandemic. In North America I believe this led to a delayed (and ultimately minimal) social adoption of masks, a late acceptance of the aerosol-spread vector, an over-emphasis on hand washing, and a far-too-late restriction on international travel and mass public events, well past the point when it could have contributed to containing the disease (vs Taiwan's much more effective management, for example).

Of course there's no guarantee that those ideas would have been accepted in time to matter had there been a freer market for views, and of course it would have opened the door to more incorrect ideas as well, but I'm of the view that it would have helped.

More importantly I think those heavy restrictions on pre-consensus ideas (as many of them would later become consensus) helped lead to a broader undermining of trust in institutions, the fallout of which we are observing today.

mac-attack 6 hours ago | parent [-]

The issues you are bringing up don't highlight that they stuck with the wrong decision, but rather that they didn't pivot to the right decision as fast as you'd like... yet your solution is bottom-up decision-making that will undoubtedly take much, much longer to reach a consensus? How do you square that circle?

Experts can study and learn from their prior mistakes. Continually doing bottom-up when we have experts is inefficient and short-sighted, no? Surely you would streamline part of the process and end up in the pre-Trump framework yet again?

Also, I'm curious why you have such a rosy picture of the bottom-up alternatives. Are you forgetting about the ivermectin overdoses? 17,000 deaths related to hydroxychloroquine? The US president suggesting people drink bleach? It is easy to cherry-pick the mistakes that science makes while overlooking the noise and misinformation that worms its way into less-informed/less-educated thinkers when non-experts are given the reins.

mitthrowaway2 5 hours ago | parent [-]

No, I'm not criticizing the officials for failing to reach the correct decision or adopt the correct viewpoints faster than they did. Institutions are large and risk-averse, data was incomplete, and people make mistakes.

I'm criticizing them for suppressing the dissemination of ideas that did later turn out to be correct. I hope the distinction is clear.

If you're going to impose a ban on the dissemination of ideas, you'd better be ten thousand percent sure that nothing covered by that ban later turns out to be the truth. Not a single one, not even if every other idea that got banned was correctly identified as a falsehood. Otherwise, the whole apparatus falls apart and institutions lose trust.

I'm not forgetting ivermectin overdoses. I don't believe my picture is rosy. I'm aware of all the garbage ideas out there, which is why the measles is back and all the other madness. But I'm firmly of the opinion that trying to suppress these bad ideas has only redoubled their strength in the backlash, and caused a rejection of expert knowledge altogether.

Zanfa 3 hours ago | parent | prev | next [-]

IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale with which Russian (or similar) troll factories can manufacture grassroots content or fund influencers are not something that can be combated at an individual level.

It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, and they'll push both sides of the same conflict.

Aloha 10 hours ago | parent | prev | next [-]

I think it made sense as a tactical choice at the moment, just like censorship during wartime - I don't think it should go on forever, because doing so is incompatible with a free society.

llm_nerd 9 hours ago | parent [-]

It didn't even make sense at the time. It tainted everything with the suggestion that the official, accepted truth needed to suppress alternatives to win the battle for minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

It massively amplified the nuts. It brought it to the mainstream.

I'm a bit amazed seeing people still justifying it after all we've learned.

COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew the vaccine had a negligible effect on spread. When platforms run by "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

And now we're living with the consequences. Where we have a worm-addled halfwit directing medicine for his child-rapist pal.

LeafItAlone 8 hours ago | parent [-]

>It massively amplified the nuts. It brought it to the mainstream.

>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

In theory, I agree, kind of.

But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.

jasonlotito 5 hours ago | parent | next [-]

> But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be.

Google makes it very clear that these were choices they made, and that they were independent of whatever the government was asking. Suggesting these policies are anything other than Google's is lying.

llm_nerd 8 hours ago | parent | prev [-]

Sure, but I'm not remotely blaming Biden[1]. A lot of tech companies took this on themselves, seeing themselves as arbiters of speech for a better world. Some admin (Trump admin) people might have given them suggestions, but they didn't have to do the strong-arm stuff, and the results weren't remotely helpful.

We already had a pretty strong undercurrent of contrarianism regarding public health -- it's absolutely endemic on here, for instance, and was long before COVID -- but this mainstreamed it. Before COVID I had a neighbour who would always tell me in hushed tones that he knows what's really going on because he's been learning about it on YouTube, etc. It was sad, but he was incredibly rare. Now that's like every other dude.

And over 80% of the US public got the vaccine! If we were to do COVID again, I doubt you'd hit even 40% in the US now. The problem is dramatically worse.

[1] That infamous Zuck interview with Rogan, where Zuck licked Trump's anus to ingratiate himself with the new admin, was amazing in that he kept blaming Biden for things Meta did long before Biden's admin took office or even took shape. Things he did at the urging of the Trump admin pt 1. I still marvel that he could be so astonishingly deceptive and people don't spit in his lying face for it.

electriclove 8 hours ago | parent | prev | next [-]

I agree, and I'm pro vaccines but want the choice on if/when to vaccinate my kids. I believe there were election discrepancies but I'm not sure if it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.

amanaplanacanal 6 hours ago | parent [-]

You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.

electriclove 2 hours ago | parent [-]

In California, it is required for public schools and many private schools also require it, so effectively it isn't much of a choice.

kypro 10 hours ago | parent | prev | next [-]

I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.

I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

trollbridge 10 hours ago | parent | next [-]

And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.

It's often a lot better to just let kooks speak freely.

vFunct 9 hours ago | parent [-]

It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.

There is nobody more confident in themselves than the middle-class.

khazhoux 9 hours ago | parent [-]

That’s a very confident statement presented without a hint of evidence.

vFunct 8 hours ago | parent [-]

You know that there are studies addressing this, right? I didn't just make it up.

Here's an overview study that reviewed other studies: https://jphe.amegroups.org/article/view/9493/html

"Pre-COVID-19 interviews with a high-income vaccine hesitant sample in Perth, Australia found that vaccine hesitancy was based on an inflated sense of agency in making medical decisions without doctors or public health officials, and a preference for “natural” methods of healthcare (30)."

"A similar study in the United States reported on interviews from 25 White mothers in a wealthy community who refused vaccination for their children (31). These participants reported high levels of perceived personal efficacy in making health decisions for their children and higher confidence in preventing illness through individual “natural” measures such as eating organic food and exercising. Additionally, these participants report lower perceived risk of infection or disease, which is contrasted with their high perceived risk of vaccination."

"Vaccine hesitancy among those with privilege may be more than just a product of resource access. There is evidence that individuals with high socioeconomic status perceive themselves to be more capable, hardworking, important, and deserving of resources and privileges than others (32,33)"

khazhoux 4 hours ago | parent [-]

You said the middle class is the most unreasonably confident group of people. I don't see anything to that effect in what you posted. Yes, I think it's just your made-up dismissive generalization.

gm678 7 hours ago | parent | prev | next [-]

That didn't happen in a vacuum; there was also a _lot_ of money going into pushing anti-vaccine propaganda, both for mundane scam reasons and for political reasons: https://x.com/robert_zubrin/status/1863572439084699918?lang=...

nxm 4 hours ago | parent | prev | next [-]

The issue is that we weren't/aren't even allowed to question the efficacy or long-term side effects of any vaccine.

someNameIG 8 hours ago | parent | prev | next [-]

It's more that people in general* connect to personal stories far more than to impersonal factual data. It's easier to connect to seeing people say they had adverse reactions to a vaccine than to statistical data showing it's safer to get vaccinated than not. It's also easier to believe conspiracies: it's easier to think bad things happen due to the intent of bad people than to accept that the world is a complex, hard-to-understand place with no intent behind things happening.

These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.

*And this skews more toward the less educated and less intelligent.

logicchains 9 hours ago | parent | prev | next [-]

>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.

The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .

wvenable 8 hours ago | parent | next [-]

It isn't hard to find that randomized controlled trials and large meta-analyses show that COVID vaccines are highly effective. No need to rely on media. You can point to one or two observational re-analyses that show otherwise but overall they are not particularly convincing given the large body of easily accessible other evidence.

lisbbb 4 hours ago | parent [-]

I don't think a meta-analysis is worth anything at all, to be totally honest with you. I also don't think those gene therapy shots were at all effective, given how many people contracted covid after receiving the full course of shots. I think basic herd immunity ended covid, and the hysteria lasted far beyond the timeframe in which there was truly a problem.

Furthermore, I think those shots are the cause of many cancers, including my wife's. The mechanism? The shots "programmed" the immune system to produce antibodies against covid to the detriment of all other functions, including producing the killer T-cells that destroy cells in the early stages of becoming cancerous. That's why so many different cancers are happening, as well as other weird issues like the nasty and deadly clotting people had. I have no idea about myocarditis, but that's fine because it is a well documented side effect that has injured a lot of people. So cancer and pulmonary issues are the result of those poorly tested drugs that were given out to millions of people without informed consent and with no basic ethical controls on the whole massive experiment.

And before you gaslight me, please understand that my wife, age 49, was diagnosed with a very unusual cancer for someone of her sex and age, and it's been a terrible fight since June of 2024 to try and save her life, which has nearly been lost 3x already! Of course I have no proof that the Pfizer shots caused any of this, but damn, it sure could have been that. Also, her cousin, age 41, was diagnosed with breast cancer that same year. So tell me, how incredibly low probability is it that two people in the same family got cancer in the same year? It's got to be 1 in 10 million or something like that. Just don't gaslight me--we can agree to disagree. I'm living the worst case scenario post covid and I only hope my daughter, who also got the damn shots, never comes down with cancer.

wvenable 3 hours ago | parent [-]

I am sorry to hear what you and your wife are going through. Nothing I say here is meant to dismiss your experience.

That said, I think it's important to separate personal experiences from what the larger body of evidence shows. Many vaccinated people still got COVID, especially once Omicron came along. The vaccines were never perfect at preventing infection. But the strongest data we have from randomized trials and real-world results show that vaccinated people were far less likely to end up in the ICU or die from COVID. That's what the vaccines were designed to do and that's where they consistently worked.

As for cancer, I understand why you'd connect your wife's diagnosis to the vaccine -- it's natural to search for causes -- our brains are wired to look for patterns, especially when big events happen close together. But cancer registries and monitoring systems around the world haven't found an increase in cancer rates linked to COVID vaccines. The vaccines give a short-lived immune stimulus; they don't reprogram the immune system or permanently shut down T-cells. My family has a long history of cancer going back generations. Literally every other member of my family has had cancer long before COVID. Two people in the same family getting cancer in the same year is unfortunately not as unlikely as you want to believe. That is perhaps a cold comfort, but doctors and scientists aren't seeing the pattern you're worried about.

That isn't to say there aren't side effects to the vaccine. Myocarditis and clotting problems are well documented but rare side-effects. In fact, someone I know about indirectly had a heart attack immediately after the COVID vaccine -- his family is genetically predisposed to this kind of heart attack but it was directly triggered by the shot (he survived). It's good to acknowledge those risks. But when you look at the big picture, health agencies estimate that the vaccines prevented millions of deaths. I sadly know of a few people who died from COVID prior to vaccine availability and have family members with permanent lung issues. They're currently struggling to get another COVID shot because they don't think they can survive getting it unprotected again.

rpiguy 9 hours ago | parent | prev | next [-]

I appreciate you.

People have become more anti-vax because the Covid vaccines were at best ineffective, and as you said, anything contra-narrative is buried or ignored.

If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.

More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.

OrvalWintermute 8 hours ago | parent [-]

Throughout my life I always got vaccines without a question. Thought antivaxxers were nutty/crazy. When I was in the US military overseas I was stuck regularly as only world travelers going to disease hotspots are.

When they ignored my wife's medical allergy to vaccine ingredients while she was pregnant, and a medical friend in Europe warned me about people dying there due to the vaccine, I rethought my previous position.

Started crunching numbers.

Hearing of vaccine impurities and contamination w. SV40

Told by vet friends about side effects being suppressed from the DMED database

VAERS numbers seemed pretty bad

JHU numbers painted a very mixed story

Bioethics around informed consent disappeared

Read over vaccine production process and the filth it entails

Vaccine Mafia came out in force.

Am so thankful now that I did not get the vaccine and my eyes were opened by our Kleptocratic vaccine industry.... I always thought BigPharma was an issue, but didn't realize how tyrannical they could be via outsourcing enforcement to the federal, state, and local government in cahoots with Academia & Retail.

No Trust!

deepburner 7 hours ago | parent [-]

So those bots made it to hackernews huh

OrvalWintermute 2 hours ago | parent [-]

Apparently speaking for yourself Mr. 262 Karma

;)

cynicalkane 8 hours ago | parent | prev [-]

This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove anything. The theorist offers no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few unremarkable randos who somehow have the truth.

The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them. The typical example of sampling-time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, giving a correlation between screening and survivability. So you get a time effect where the fastest-growing cancers never end up in the measurement, biasing the data.

But in measurements where neither outcome biases the odds of that outcome being sampled, there can be no measurement-time effect, which is why it's not corrected for in studies like this. The authors do not explain why measurement-time effects would have anything to do with detecting or not detecting death rates, either in the abstract or anywhere else in the paper, because they are quacks who apply arbitrary math to get the outcome they want.
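For readers who haven't met the term, here is a minimal, self-contained sketch of how immortal time bias works in general; it is not taken from either paper, and every number in it is hypothetical. The trick is in how person-time gets classified: anyone counted as "vaccinated" had to survive long enough to receive the shot, so crediting that pre-shot survival time to the vaccinated group manufactures an apparent benefit even when the vaccine does nothing.

```python
import random

# Minimal sketch (hypothetical numbers, not either paper's analysis) of how
# "immortal time" can make a do-nothing vaccine look protective when people
# are classified as "ever vaccinated" instead of by their current status.
random.seed(0)
HAZARD = 0.001      # same daily death risk for everyone: a null effect
FOLLOW_UP = 365

naive = {"vax": [0, 0], "unvax": [0, 0]}     # [deaths, person-days]
correct = {"vax": [0, 0], "unvax": [0, 0]}

for _ in range(200_000):
    # 70% of people are scheduled for a shot on a random day of follow-up.
    scheduled = random.randint(1, FOLLOW_UP) if random.random() < 0.7 else None
    # Constant hazard, independent of vaccination status.
    death_day = next((d for d in range(1, FOLLOW_UP + 1)
                      if random.random() < HAZARD), None)
    end = death_day if death_day is not None else FOLLOW_UP
    # You only actually get vaccinated if you survive to the scheduled day.
    got_shot = scheduled is not None and (death_day is None or scheduled < death_day)

    # Correct analysis: person-time classified by *current* status.
    if got_shot:
        correct["unvax"][1] += scheduled - 1
        correct["vax"][1] += end - scheduled + 1
        if death_day is not None:
            correct["vax"][0] += 1
    else:
        correct["unvax"][1] += end
        if death_day is not None:
            correct["unvax"][0] += 1

    # Naive analysis: "ever vaccinated" from day one. The days a person had
    # to survive before their shot are immortal time credited to the
    # vaccinated group, and everyone who died early lands in "unvax".
    group = "vax" if got_shot else "unvax"
    naive[group][1] += end
    if death_day is not None:
        naive[group][0] += 1

def per_100k(deaths, days):
    return round(1e5 * deaths / days, 1)

for label, tally in [("correct", correct), ("naive", naive)]:
    print(label, {g: per_100k(d, t) for g, (d, t) in tally.items()})
```

With the null effect baked in, the status-by-day analysis prints roughly equal death rates in both groups, while the "ever vaccinated" analysis makes the vaccinated group look protected. Whether that mechanism applies to a study that classifies exposure correctly is exactly what is in dispute here.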

As another commenter pointed out, randomized controlled trials -- which cannot possibly have this made-up time effect -- often clearly show a strongly positive effect for vaccination.

I did not read the second paper.

lisbbb 4 hours ago | parent [-]

There is no conspiracy; the studies were all crap! They raced through them, failed at basic double-blind methodology, and gave control groups live shots afterwards, eliminating any retrospective studies. There was never any positive effect. It didn't exist. It's disgusting what happened and how so many professionals that we rely on to stand up and tell the truth knuckled under to the pressure of the moment and lied or turned their backs.

vkou 10 hours ago | parent | prev | next [-]

> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

Nah, the same grifters who stand to make a political profit off turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.

As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much as - or more than - someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)

kypro 10 hours ago | parent [-]

I agree. Again the vast majority would have gotten the vaccine.

There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.

> They've completely taken over public discourse on a wide range of subjects

Most people are not anti-vax. If "they've" "taken over public discourse" on other subjects to the point that you are now holding a minority opinion, you should consider whether "they" are right or wrong and why so many people believe what they do.

If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate isn't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).

vel0city 6 hours ago | parent [-]

While I do agree "most people are not anti-vax", the rates of opting out of vaccines or doing delayed schedules or being very selective have gone way up.

Some of these public school districts in Texas have >10% of students objecting to vaccines. My kids are effectively surrounded by unvaccinated kids whenever they go out in public. There's a 1 in 10 chance that kid on the playground has never had a vaccine, and that rate is increasing.

A lot of the families I know actively having kids are pretty crunchy and are at least vaccine hesitant if not outright anti-vax.

https://www.dshs.texas.gov/sites/default/files/LIDS-Immuniza...

stefantalpalaru 10 hours ago | parent | prev | next [-]

> one of the only things that actually worked to stop people dying was the roll out of effective vaccines

"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)

"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)

"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)

"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)

boxerab 7 hours ago | parent | prev [-]

Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.

dyauspitr 35 minutes ago | parent | prev | next [-]

Silencing people is the only thing that works is what I’ve learned on the internet.

stinkbeetle 6 hours ago | parent | prev | next [-]

For that matter why is it even such a crazy wild idea for anybody to dare to question medicines and motives from pharmaceutical companies? Or question elections?

Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.

The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably could be most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so they didn't get taken up so much among those groups. So there was really no need for this massive sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and be done with it.

dotnet00 4 hours ago | parent [-]

With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, hydroxychloroquine and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate? This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result despite having no substantiated evidence.

stinkbeetle 3 hours ago | parent [-]

> With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, hydroxychloroquine and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

Pushback on what? There's always been new age hippy garbage, Chinese medicine, curing cancer with berries, and that kind of thing around. I don't see that causing much damage and certainly not enough to warrant censorship. People can easily see through it and in the end they believe what they want to believe.

Far, far more dangerous, and the cause of real damage that I have seen, is what comes from the pharmaceutical industry and their captured regulators. Bribing medical professionals, unconscionable public advertising practices, conspiring to push opioids on the population, lying about the cost to produce medications, and on and on. There's like, a massive list of the disasters these greedy corporations and their spineless co-conspirators in government regulators have caused.

Good thing we can question them, their motives, their products.

> With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate?

I don't understand your question. Can you explain why you think Jan 6 would be a pretty good indication that discussion and disagreement about elections should be censored?

> This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.

I never quite followed exactly what the legal issues were around that election. Trump was alleged to have tried to illegally influence some election process and/or obstructed the legal transfer of power. Additionally there was a riot of people who thought Trump won, and some broke into Congress and tried to intimidate lawmakers.

I mean taking the worst possible scenario, Trump knew he lost and was scheming a plan to seize power and was secretly transmitting instructions to this mob to enter the building and take lawmakers hostage or something like that. Or any other scenario you like, let your imagination go wild.

I still fail to see how that could possibly justify censorship of the people and prohibiting them from questioning the government or its democratic processes. In fact the opposite, a government official went rogue and committed a bunch of crimes so therefore... the people should not be permitted to question or discuss the government and its actions?

There are presumably laws against those actions of rioting, insurrection, etc. Why, if the guilty could be prosecuted for those crimes, should the innocent pay with the destruction of their human rights, in a way that wouldn't even solve the problem and could easily enable worse atrocities to be committed by the government in the future?

Should people who question the 2024 election be censored? Should people who have concerns with the messages from the government's foremost immigration and deportation "experts" be prohibited from discussing their views or protesting the government's actions?

dotnet00 2 hours ago | parent [-]

Robbery is a crime, so why should people take any measures to protect their things from being stolen? Murder is a crime, so why care about death threats?

New age medicine has been around forever, yes. But the effects are only known to be negligible outside of pandemics. We know from history that people did many irrational things during past pandemics due to fear and social contagion.

It's a tough problem, everyone believes themselves an expert on everything, plus trolls and disinformation campaigns. There's also a significant information asymmetry.

It's funny you mention opioids, as I just recently came across a tweet claiming that Indians were responsible for getting Americans addicted to them via prescription. In one of the buried reply chains, the poster admits they have no evidence and are just repeating a claim someone made to them sometime. But how many people will read that initial post and reinforce their racist beliefs versus see that the claim was unsubstantiated? And when that leads to drastic action by a madman, who's going to be the target of the blame? The responsibility is too diffuse to pin on any specific person, the government obviously won't take it, and since madmen don't act in a vacuum, the blame falls on the platform.

Yes, no one should have the power to determine what ideas are and are not allowed to propagate. On the other hand, you can still go to other platforms and are not entitled to the reach of the major platforms; but then again, these platforms are extremely influential. At the same time there's also the problem that people partly hold the platforms responsible when they spread bad ideas, the platform operators feel some level of social responsibility, and the platform owners don't want legal responsibility.

wvenable 8 hours ago | parent | prev | next [-]

> But I think we have to realize silencing people doesn't work.

Doesn't it though? I've seen this repeated like it's fact, but I don't think it's true. If you banned some randomly chosen conspiracy theory from YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.

Many of these things arrived out of nothing and can disappear just as easily.

It's basic human nature that simply hearing things repeated over and over embeds them in your consciousness. If you're not careful and aware of what you're consuming, then that becomes part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.

fullshark 7 hours ago | parent | prev | next [-]

It works 99% of the time and you are overindexing on the 1% of the time it doesn’t to draw your conclusion.

yojo 9 hours ago | parent | prev | next [-]

I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

trinsic2 7 hours ago | parent | prev | next [-]

They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. Only when people stop using tech they don't control, and stop supporting people or systems that do not have their best interests at heart, will we see real change.

brookst 7 hours ago | parent [-]

There is no conspiracy. It’s all emergent behavior by large groups of uncoordinated dunces who can’t keep even the most basic of secrets.

braiamp 8 hours ago | parent | prev | next [-]

> But I think we have to realize silencing people doesn't work

It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.

- https://www.nature.com/articles/s41586-024-07524-8
- https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1...
- https://dl.acm.org/doi/abs/10.1145/3479525
- https://arxiv.org/pdf/2212.11864

Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.

hash872 10 hours ago | parent | prev | next [-]

It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:

Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?

Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?

softwaredoug 10 hours ago | parent | next [-]

I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.

andrewmcwatters 9 hours ago | parent [-]

Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.

benjiro 9 hours ago | parent [-]

Refuting does not work... You can throw scientific study upon study, doctor upon doctor at it... negatives run deeper than positives.

In the open, it becomes normalized and draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?

The only people benefiting from those dark concepts are those with financial interests. They make money from it, and push the negatives to sell their products and cures. Those that fight against it do not gain from it, and it costs them time/money. That is why it is a losing battle.

int_19h 7 hours ago | parent [-]

Whether something is normalized or not is mostly down to public opinion, not to censorship (whether by the government or by private parties).

Few countries have more restrictions on Nazi speech than Germany. And yet not only is the AfD a thing, it keeps growing.

drak0n1c 9 hours ago | parent | prev | next [-]

Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...

In this case it wasn't a purely private decision.

rahidz 9 hours ago | parent | prev | next [-]

"Where's the limiting principle here?"

How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?

And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.

Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.

TeeMassive 9 hours ago | parent | prev [-]

> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

1) They are public corporations and are legal creations of the state and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.

2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.

3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.

hash872 8 hours ago | parent [-]

Re: 1- one certain protection of the state that they benefit from is the US Constitution, which as interpreted so far forbids the government from impairing their free speech rights. Making a private actor host content they personally disagree with violates their right of free speech! That's what the 1st Amendment is all about.

2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

3. What market is Youtube a monopoly in?

themaninthedark an hour ago | parent [-]

> 2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

The 6–3 majority determined that neither the states nor other respondents had standing under Article III, reversing the Fifth Circuit decision.

In law, standing or locus standi is a condition that a party seeking a legal remedy must show they have, by demonstrating to the court, sufficient connection to and harm from the law or action challenged to support that party's participation in the case.

Justice Amy Coney Barrett wrote the opinion, stating: "To establish standing, the plaintiffs must demonstrate a substantial risk that, in the near future, they will suffer an injury that is traceable to a government defendant and redressable by the injunction they seek. Because no plaintiff has carried that burden, none has standing to seek a preliminary injunction."

The Supreme Court did not say that what was done was legal; it only said that the people asking for the injunction and bringing the lawsuit could not show how they were being hurt or were going to be hurt.

aesthethiccs 3 hours ago | parent | prev | next [-]

Yes we should be allowed to bully idiots into the ground.

ants_everywhere 8 hours ago | parent | prev | next [-]

These policies were put in place because the anti-vax and election-skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.

The US military also promoted anti-vax propaganda in the Philippines [0].

A lot of the comments here raise good points about silencing well meaning people expressing their opinion.

But information warfare is a fundamental part of modern warfare. And it's effective.

An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.

So

> I think we have to realize silencing people doesn't work

it seems to have been reasonably effective at combating disinformation networks

> It just causes the ideas to metastasize

I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.

[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...

tonfreed 10 hours ago | parent | prev | next [-]

The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other

tzs 7 hours ago | parent | next [-]

> The best disinfectant is sunlight

Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.

The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.

Also, by the time you get your argument out many of the people who saw the piece you are refuting and believed it won't even see your argument. They've moved on to other topics and aren't going to revisit that old one unless it is a topic they are particularly interested in. A large number will have noted the original misinformation, such as some totally unsafe quack cure for some illness that they don't currently have, accepted it as true, and then if they ever find themselves with that illness apply the quack cure without any further thought.

The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.

Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.

LeafItAlone 9 hours ago | parent | prev | next [-]

>The best disinfectant is sunlight.

Is it? How does that work at scale?

Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).

Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their "evidence" and truly believe they are right, no matter what the other person says.

TeeMassive 9 hours ago | parent [-]

What's your alternative? The opposite is state-dictated censorship and secrecy, and those have turned out very wrong every single time.

LeafItAlone 9 hours ago | parent [-]

I honestly don’t know. My libertarian foundation wants me to believe that any and all ideas should be able to spread. But with the technological and societal changes in the past 10-15 years, we’ve seen how much of a danger this can be too. A lie or mistrust can be spread faster than ever to a wider audience than was previously possible. I don’t have a solution, but what we have now is clearly not working.

api 8 hours ago | parent [-]

The root problem is that people don’t trust authorities. Why? Because they burned that trust.

People don’t believe the scientific consensus on vaccines because there were no WMDs in Iraq, to give one of many huge examples.

“But those were different experts!”

No they weren’t. Not to the average person. They were “the authorities,” and “the authorities” lied us into a trillion dollar war. Why should anyone trust “the authorities” now?

Tangentially… as bad as I think Trump is, he’s still not as bad as George W Bush in terms of lasting damage done. Bush II was easily the worst president of the last 100 years, or maybe longer. He is why we have a president Trump.

DangitBobby 8 hours ago | parent | prev | next [-]

And not letting the disease spread to begin with is better than any disinfectant.

slater- 9 hours ago | parent | prev | next [-]

>> The best disinfectant is sunlight.

Trump thought so too.

thrance 9 hours ago | parent | prev [-]

How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.

Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.

andrewmcwatters 9 hours ago | parent [-]

Well, people literally died. So, I think we all know how it played out.

The same thing that has happened since time immemorial will continue to occur: the educated and able will physically move themselves from risk, and others will suffer either by their own volition, or by association, or by lot.

dawnerd 8 hours ago | parent | prev | next [-]

It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.

lkey 9 hours ago | parent | prev | next [-]

I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.

lkey 9 hours ago | parent | next [-]

To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.

mvdtnz 9 hours ago | parent | prev [-]

And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.

lkey 9 hours ago | parent [-]

These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)

mvdtnz 9 hours ago | parent [-]

And most people roll their eyes and don't believe it. Which is why it's a good idea not to make it true.

lkey 8 hours ago | parent [-]

Conspiratorial thinkers are more likely to believe both that Osama Bin Laden was already dead before the raid and that he is still alive than to accept the official narrative that he was killed on the day reported. https://www.researchgate.net/publication/235449075_Dead_and_...

In general, you can't argue or 'fact' people out of beliefs they were not argued into. The best you can do is give them a safe place to land when disconfirmation begins. Don't be too judgy, no one is immune to propaganda.

deegles 9 hours ago | parent | prev | next [-]

no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"

NullCascade 9 hours ago | parent [-]

Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.

Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.

As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.

vkou 10 hours ago | parent | prev | next [-]

> But I think we have to realize silencing people doesn't work.

We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.

For some reason, that didn't work either.

What is going to work? And what is your plan for getting us to that point?

_spduchamp 8 hours ago | parent [-]

Algorithmic Accountability.

People can post all sorts of crazy stuff, but the algorithms do not need to promote it.

Countries can require Algorithmic Impact Assessments and set standards of compliance with recommended guidelines.

amanaplanacanal 6 hours ago | parent [-]

This seems unlikely to be constitutional in the US.

bencorman 8 hours ago | parent | prev | next [-]

I wish someone could have seen the eye roll I just performed reading this comment.

Silencing absolutely works! How do you think disinformation metastasized!?

heavyset_go 9 hours ago | parent | prev | next [-]

When the pogroms[1] start, it will be a luxury to let it ride out so you can roll your eyes at it.

There's a reason you don't fan the flames of disinformation. Groups of people cannot be reasoned with like you can reason with an individual.

[1] https://systemicjustice.org/article/facebook-and-genocide-ho...

krapp 8 hours ago | parent | prev | next [-]

>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

Except many people don't roll their eyes at it, and that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. And all of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.

We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"

Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.

benjiro 9 hours ago | parent | prev | next [-]

Funny thing: several people who counter-responded and disagreed got grayed out (i.e. downvoted into the negative... as in censored).

Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.

The moment you are in the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.

That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The number of negative channels massively outweighs positive/fact-based channels because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, theft, politicians and taxes or other negative world news: it has been proven that people take in negative news much more readily, and negative clickbait titles draw people in.

There is a reason why holocaust denial is illegal in countries. Because the longer some people can spew that, the more people actually start to believe it.

Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1-3 minute clips. YouTube videos longer than 30 minutes are horrible for a YouTuber's income because people simply do not have the attention span.

Why do we have laws like seatbelts, speed limits, and other "control" over people? Because people left to their own devices can be extremely uncaring about their own family, others, even themselves.

Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family or others.

Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yea ...

We only need to look at platforms like X when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).

Not sure why I am writing this because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. How a person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned so skeptical of everything vaccine-related. All because those anti-vax channels got to her.

The anti-vax movement killed people. There is scientific study upon study of how red states in the US ended up with more deaths over the same time periods. And yet, not a single person was ever charged for this... everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.

The fact that people have given up, and now accept letting those who often have financial interests spew nonsense as much as they like. Well, it's "normal".

I weep for the human race because we are not going to make it.

breadwinner 9 hours ago | parent | prev | next [-]

> silencing people doesn't work

I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?

Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?

JumpCrisscross 9 hours ago | parent | next [-]

Slow down our algorithmic hell hole. Particularly around elections.

LeafItAlone 9 hours ago | parent | next [-]

>Slow down our algorithmic hell hole.

What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

JumpCrisscross 9 hours ago | parent | next [-]

> What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.

I’d also argue for demonetising political content, but idk if that would fly.
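A minimal sketch of what that could look like mechanically, with everything (the two-hour hold, class and method names) made up for illustration rather than drawn from any real platform's API: posts are queued on submission, editing a held post resets its timer, and nothing becomes publicly visible until the hold expires.

```python
import time

# Hypothetical sketch of a publication-delay queue: a post becomes publicly
# visible only after a fixed hold period, editing a held post resets the
# timer, and deleting cancels it. The 2-hour hold and all names here are
# illustrative, not any platform's real API.
HOLD_SECONDS = 2 * 60 * 60

class DelayedFeed:
    def __init__(self):
        self._pending = {}   # post_id -> (release_time, content)
        self._visible = {}   # post_id -> content

    def submit(self, post_id, content, now=None):
        now = time.time() if now is None else now
        self._pending[post_id] = (now + HOLD_SECONDS, content)

    def edit(self, post_id, content, now=None):
        # Changing a held post restarts the clock, as the comment proposes.
        if post_id in self._pending:
            self.submit(post_id, content, now)

    def delete(self, post_id):
        self._pending.pop(post_id, None)
        self._visible.pop(post_id, None)

    def release_due(self, now=None):
        # Move posts whose hold has expired into the visible feed.
        now = time.time() if now is None else now
        due = [pid for pid, (t, _) in self._pending.items() if t <= now]
        for pid in due:
            self._visible[pid] = self._pending.pop(pid)[1]
        return due

feed = DelayedFeed()
feed.submit("p1", "first draft", now=0)
feed.edit("p1", "calmer second draft", now=3600)  # hold restarts at t=3600
print(feed.release_due(now=7200))                 # [] -- still on hold
print(feed.release_due(now=3600 + HOLD_SECONDS))  # ['p1'] -- now visible
```

The point is that the only lever is time, not content, which is what would keep a rule like this viewpoint-neutral.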

LeafItAlone 9 hours ago | parent [-]

Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.

Who decides what falls in this bucket? The government? That seems to go against the idea that government shouldn't restrict speech and ideas.

JumpCrisscross 9 hours ago | parent [-]

> who makes it happen and enforces the rules?

Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)

> For all content or just “political”?

The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.

I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’ll just get meme names and nobody needs that.)

Bonus: electeds get constituent pressure to consolidate elections.

Alternative: these platforms already track trending topics, so an easy fix is to slow down trending topics. It doesn't even need to be by that much; what we want is for people to stop, think, and have a chance to reflect on what they do, maybe take a step away from their device while at it.
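To make the "slow down trending" idea concrete, here is a tiny hedged sketch (the cap value and the sample data are invented): the displayed trending score is allowed to rise only so fast per hour, so a sudden spike surfaces gradually while declining topics fall immediately.

```python
# Illustrative only: dampen how quickly a topic's displayed trending score
# can rise. Raw engagement still flows in, but the surfaced score may grow
# by at most MAX_RISE_PER_HOUR, so spikes reach the trending page gradually.
MAX_RISE_PER_HOUR = 0.25   # displayed score can grow at most 25% per hour

def damped_scores(raw_by_hour, start=1.0):
    shown = start
    out = []
    for raw in raw_by_hour:
        ceiling = shown * (1 + MAX_RISE_PER_HOUR)
        shown = min(raw, ceiling) if raw > shown else raw  # falls freely, rises slowly
        out.append(round(shown, 2))
    return out

# A topic that spikes 20x in one hour surfaces over several hours instead.
spike = [1, 1, 20, 20, 20, 20, 5]
print(damped_scores(spike))
```

Nothing is removed under a scheme like this; a spike just takes a few hours to dominate the trending page instead of minutes.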

breadwinner 8 hours ago | parent | prev [-]

Easy solution: Repeal Section 230.

Allow citizens to sue social media companies for the harm caused to them by misinformation and disinformation. The government can stay out of this.

JumpCrisscross 8 hours ago | parent [-]

> Easy solution: Repeal Section 230

May I suggest only repealing it for companies that generate more than a certain amount of revenue from advertising, or who have more than N users and have algorithmic content elevation?

breadwinner 8 hours ago | parent [-]

That seems like a reasonable middle ground.

breadwinner 9 hours ago | parent | prev [-]

If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.

"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".

JumpCrisscross 9 hours ago | parent [-]

> If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.

breadwinner 8 hours ago | parent [-]

It is social media that is the root problem.

On the internet everything can appear equally legitimate. Breitbart looks as legit as the BBC. See Sacha Baron Cohen's speech: https://www.youtube.com/watch?v=ymaWq5yZIYM

Excerpts:

Voltaire was right when he said "Those who can make you believe absurdities can make you commit atrocities." And social media lets authoritarians push absurdities to millions of people.

Freedom of speech is not freedom of reach. Sadly, there will always be racists, misogynists, anti-Semites, and child abusers. We should not be giving bigots and pedophiles a free platform to amplify their views and target their victims.

Zuckerberg says people should decide what's credible, not tech companies. When 2/3rds of millennials have not heard of Auschwitz how are they supposed to know what's true? There is such a thing as objective truth. Facts do exist.

altruios 9 hours ago | parent | prev | next [-]

Censorship is a tool to combat misinformation.

It's taking a sword to the surgery room where no scalpel has been invented yet.

We need better tools to combat dis/mis-information.

I wish I knew what that tool was.

Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?

breadwinner 8 hours ago | parent [-]

Easy solution: Repeal Section 230.

Social media platforms in the United States rely heavily on Section 230 of the Communications Decency Act, which provides them immunity from liability for most user-generated content.

DangitBobby 8 hours ago | parent [-]

This would cause widespread censorship of anything remotely controversial, including the truth. We'd be in a "censor first, ask questions later" society. Somehow that doesn't seem healthy either.

breadwinner 7 hours ago | parent [-]

Have you visited nytimes.com in recent months? Just this morning the top headline was about the lies Trump told at the UN. That's pretty controversial - the newspaper of record calling the sitting president a liar. That's not allowed in many or most countries, but it is allowed in the US. And Trump is suing the New York Times for $15 billion for defamation. That didn't intimidate the NYT. They are willing to stand behind the articles they publish. If you can't stand behind what you publish, don't publish it.

DangitBobby 7 hours ago | parent [-]

Publishing your own story is not the same thing.

TeeMassive 9 hours ago | parent | prev [-]

Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents into key positions at big tech companies?

breadwinner 8 hours ago | parent | prev [-]

The government created this problem when they enacted Section 230. This is at the root of the misinformation and disinformation... social media companies are not responsible for the harm.

The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for that information. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.

int_19h 7 hours ago | parent [-]

The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.

breadwinner 7 hours ago | parent [-]

The New York Times has published plenty of stories you could call controversial. Just this morning the top headline was that Trump lied at the UN. Trump has sued the Times for defamation, yet the paper stands by its reporting. That’s how publishing works: if you can’t defend what you publish, don’t publish it. The Section 230 debate is about whether large online platforms such as Facebook should bear similar accountability for the content they distribute. I think they should. That's the only way we can control misinformation and disinformation.

cactusplant7374 10 hours ago | parent | prev | next [-]

> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

softwaredoug 10 hours ago | parent | next [-]

It's in their interests now to throw Biden under the bus. There may be truth to this, but I'm sure it's exaggerated for effect.

nitwit005 7 hours ago | parent [-]

Worth noting that Trump directly threatened to put Zuckerberg in prison for life in relation to this: https://www.cnn.com/2024/08/31/politics/video/smr-trump-zuck...

I wouldn't trust any public statement from these companies once that kind of threat has been thrown around. People don't exactly want to go to prison forever.

HankStallone 10 hours ago | parent | prev | next [-]

It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

dotnet00 10 hours ago | parent [-]

To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.

CSMastermind 8 hours ago | parent [-]

Why wouldn't you buy it?

The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

It would be more surprising if they left Google alone.

dotnet00 8 hours ago | parent | next [-]

The implication of saying they were "pressed" by the Biden admin (as they claim in the letter) is that Google was unwilling. I don't buy that. They were complicit and are now throwing the Biden admin under the bus because it is politically convenient. Just like how the Twitter files showed that Twitter was complicit in it.

db48x 7 hours ago | parent [-]

Well of course they’re going to say that they resisted doing the bad thing, even though they still did the bad thing. All it took to get them to do the bad thing was for someone to ask them to do it, but they really resisted as hard as they could, honest.

braiamp 6 hours ago | parent [-]

Note that in their letter they carefully avoided mentioning what happened during the Trump 1.0 administration. Their policies started before Biden was president, so this is 100% throwing the Biden admin under the bus.

db48x 6 hours ago | parent [-]

Do you think that they should omit those facts? That they should fail to mention that the Biden administration used them to censor Americans?

dotnet00 6 hours ago | parent [-]

The language they're using implies the Biden admin pressured them to censor (which, as pointed out, doesn't make sense because they were doing it before Biden too), rather than just admitting that they were complicit with the Biden admin to do it.

db48x 4 hours ago | parent [-]

Yea, but we can see through their self-serving language. The fact is they decided on a policy of banning “misinformation” that the Biden administration turned into a censorship machine. One is misguided; the other is a crime.

dotnet00 3 hours ago | parent [-]

The 1st amendment doesn't prevent the government from making suggestions to private companies. They aren't allowed to coerce them into censoring things. So it still isn't a crime.

What the Biden admin did was not acceptable, and even at the time I got plenty of heat from HN for thinking that it was a sketchy loophole for the government to use, that it was against the spirit of the law.

I'm trying to emphasize the distinction because the companies' self-serving language is going to be abused to claim that the current admin - which has just threatened to sue a TV channel for bringing back a show it had tried to threaten the channel into getting rid of - is actually a defender of free speech.

braiamp 8 hours ago | parent | prev [-]

If you read those documents, you will see that the administration was telling them that those accounts were in violation of Twitter's TOS. They simply said "hey, this user is violating your TOS, what are you gonna do about it?", and Twitter applied its own rules.

db48x 7 hours ago | parent [-]

That was after they had changed the TOS to make it against the rules to talk about certain topics, such as gain of function research at Chinese labs that was funded by researchers that were themselves funded by the US government.

braiamp 6 hours ago | parent [-]

Which is still a debunked theory. Nobody created SARS-CoV-2 for nefarious purposes. The best theory we have is that there was a containment failure. But people pushing that theory wanted it to be a conspiracy instead, when plain human failure explains everything.

db48x 6 hours ago | parent [-]

I never said that it was created for nefarious purposes. That was you projecting or creating a straw man to attack.

braiamp 5 hours ago | parent [-]

What part of "people pushing that theory wanted it to be a conspiracy instead" was missed? I don't care what you think happened, I just don't want to hear more conspiracy talk. I'm tired of it. We are humans and, therefore, stupidly imperfect creatures. There isn't anything to learn about the event, other than humans gonna human.

tbrownaw 3 hours ago | parent | next [-]

> There isn't anything to learn about the event, other than humans gonna human.

US money wasn't supposed to be used to fund that kind of research. So people violated policy and evaded detection until the leak happened. How? Who? Would different audit controls have helped?

There was a cover-up after the fact. Again, how did it work and who was involved? What could have made it less effective?

The lab accident itself is the least interesting part, it's all the bureaucratic stuff that really matters. For boring generic bureaucratic-effectiveness reasons, not any "someone tried to do a bioweapon" silliness.

db48x 3 hours ago | parent | prev [-]

Except that a lot of what was banned were _not_ conspiracy theories. The truth is that the NIH _did_ fund gain of function research and that research _was_ conducted at the Wuhan Institute of Virology. Those are the facts that the government worked so hard to suppress our knowledge of. And they were able to use Google’s policies of suppressing “misinformation” to do it for several years.

frollogaston 6 hours ago | parent | prev [-]

It's been known for years that the White House was pressuring Google on this. One court ordered them to cease temporarily. I wanted to link the article, but it's hard to find because of the breaking news.

whinvik 3 hours ago | parent | prev | next [-]

It's odd. People on HN routinely complain about how Stripe or PayPal or some other entity banned them unfairly, and the overwhelming sentiment is that it was indeed unfair.

But when it comes to this thread, the sentiment mostly is banning is good and we should trust Google made the right choice.

system7rocks 8 hours ago | parent | prev | next [-]

We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

The silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and say - maybe that didn’t work like we wanted, or maybe it was heavy-handed.

In many governments, the government can do no wrong. There are no checks and balances.

The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

But hopefully we will still have a system that can have room for critique in the years to come.

electriclove 8 hours ago | parent | next [-]

It is scary how close we were to not being able to continue the conversation.

type0 8 hours ago | parent | prev [-]

> Is our current White House administration a champion of free speech? Hardly.

So after January 22, 2026, the US leaves the WHO and YouTube users will be able to contradict WHO recommendations.

lesuorac 11 hours ago | parent | prev | next [-]

Two years is a pretty long ban for conduct that isn't even illegal.

Although if they got banned at the start of covid, during the Trump administration, then we're talking about 5 years.

asadotzler 8 hours ago | parent | next [-]

No one owes them any distribution at all.

zug_zug 8 hours ago | parent | next [-]

Absolutely. Especially when those election deniers become insurrectionists.

beeflet 6 hours ago | parent | prev [-]

that is a two-way street

Simulacra 10 hours ago | parent | prev [-]

They went against a government narrative. This wasn't Google/Youtube banning so much as government ordering private companies to do so.

LeafItAlone 9 hours ago | parent | next [-]

And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.

JumpCrisscross 9 hours ago | parent | prev [-]

> wasn't Google/Youtube banning so much as government ordering private companies to do so

No, it was not. It's particularly silly to suggest this when we have live examples of such orders right now.

The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a "bully pulpit." But there were no orders, no credible threats, and plenty of companies didn't deplatform these folks.

EasyMark 3 hours ago | parent | next [-]

That's what I told my MAGA friends. Biden recommended stuff, Trump threatens stuff. So far only one of them has followed through with action: Trump has threatened business deals and prosecution, and is currently sending the government after his opponents via the DoJ. Yet those same people are as quiet as mice now on "government bullying".

spullara 9 hours ago | parent | prev | next [-]

They literally had access to JIRA at Twitter so they could file tickets against accounts.

JumpCrisscross 9 hours ago | parent | next [-]

> literally had access to JIRA at Twitter so they could file tickets against accounts

I’m not disputing that they coördinated. I’m challenging that they were coerced.

We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends the “government ordering private companies” around. (Or, say, Florida opening their criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.

unethical_ban 8 hours ago | parent | prev [-]

Do you think no nefarious nation state actors are on social media spinning disinformation?

EasyMark 3 hours ago | parent [-]

It's extremely obvious on Twitter: blue-check accounts that post every few minutes 24/7, with profiles that say stuff like "true believer, wife, lover, seeker-of-truth. Don't DM me, I don't answer". They are on there in the hundreds of thousands.

starik36 9 hours ago | parent | prev [-]

That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.

brokencode 9 hours ago | parent | next [-]

A direct line to threaten decision makers? Or to point out possible misinformation spreaders?

starik36 5 hours ago | parent [-]

Threaten. Because of the implication.

JumpCrisscross 4 hours ago | parent | next [-]

> Because of the implication

What was the implication? Twitter had no business in front of the federal government. They were wilfully complying.

That doesn't make it okay. But it's a total retconning of actual history to suggest this was government censorship in any form.

brokencode 3 hours ago | parent | prev | next [-]

Do you think Elon would ever shut up about it if he was getting threatened by Biden and the FBI?

dotnet00 3 hours ago | parent | prev [-]

Musk owned Twitter for years of the Biden admin, at least one year of that was him openly simping for Trump.

So... what sort of threat was this, that suddenly disappeared when Musk bought it? How credible was the threat if Musk was able to release the Twitter Files without repercussions from the Biden admin?

JumpCrisscross 9 hours ago | parent | prev [-]

> was certainly the case with Twitter

It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.

The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.

bluedino 9 hours ago | parent | prev | next [-]

I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.

c-hendricks 9 hours ago | parent | next [-]

Whenever someone says "i was banned from ..." take what they say with a huge grain of salt.

int_19h 7 hours ago | parent | next [-]

On Reddit, you can get banned from some subreddits simply because you have posted in another completely different sub (regardless of the content of the post).

It's not even always politics, although that's certainly a major driving force. But then you have really stupid fights like two subs about the same topic banning each others' members.

qingcharles 6 hours ago | parent | prev | next [-]

The problem (?) with Reddit is that the users themselves have a lot more control over bans than on other social media, where it is the platform itself that does the banning. This makes bans even more arbitrary than on Facebook et al.

pinkmuffinere 8 hours ago | parent | prev | next [-]

Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.

c-hendricks 6 hours ago | parent [-]

Oof, I'm outside my edit window and didn't make my correct point. It's when people say "I was banned from _____ for _____". When people say "for _____" I take their word with a huge grain of salt.

Not even much to do with Reddit, it's something I picked up from playing video games: https://speculosity.wordpress.com/2014/07/28/the-lyte-smite/

pinkmuffinere 3 hours ago | parent [-]

Ah I see, you’re saying it’s very hard/impossible to verify the reason for the ban, so the given reason is especially low-signal. That actually does make sense to me, thanks for clarifying

EasyMark 3 hours ago | parent | prev | next [-]

I was banned because I was simply in a covid sub debating with the covid-deniers. The "powers-that-be" mods literally banned anyone on that particular sub from popular subs, some of which I hadn't even been in, ever. There was (is?) a cabal of mods on there running the most popular subs like pics/memes/etc. who are definitely power-hungry basement dwellers without a life.

mvdtnz 9 hours ago | parent | prev | next [-]

Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.

Loocid 7 hours ago | parent | prev | next [-]

Eh, I was banned from several major subreddits simply for posting in a conservative subreddit, even though my post was against the conservative sentiment.

c-hendricks 5 hours ago | parent [-]

Same, happened to me after replying to a comment in the JRE sub, I think I was calling something / someone dumb. Coincidentally, that sub is openly against him now.

Tried clarifying this in another comment, my point was more that people who say "I was banned from X for doing something innocuous" are often not telling the whole truth.

tbrownaw 3 hours ago | parent [-]

> my point was more that people who say "I was banned from X for doing something innocuous" are often not telling the whole truth.

... Except when the X in question is Reddit.

alex1138 8 hours ago | parent | prev [-]

Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics

frollogaston 5 hours ago | parent [-]

The answer is to leave Reddit and let them have their echo chamber. There's no point in posting there anyway.

croes 7 hours ago | parent | prev [-]

I was banned because a moderator misunderstood my single word answer to another post.

Reddit bans aren't an indicator of anything.

rob74 an hour ago | parent | prev | next [-]

> Google's move to reinstate previously banned channels comes just over a year after Meta CEO Mark Zuckerberg said [...] that the Biden administration had repeatedly pressured Meta in 2021 to remove content related to COVID-19. "I believe the government pressure was wrong, and I regret that we were not more outspoken about it," Zuckerberg wrote in the August 2024 letter.

I'm sure Zuckerberg will say the same thing in 2029 too if the ruling party changes again. Until then, removing fact-checking and letting conspiracy theorists have their freedom of speech while suppressing voices critical of the current administration will make that change less likely...

whycome 11 hours ago | parent | prev | next [-]

What exactly constituted a violation of a COVID policy?

PaulKeeble 10 hours ago | parent | next [-]

A lot of channels had to avoid even saying the word Covid; I only saw it return to use at the end of last year. A variety of channels were banned that shouldn't have been, such as some talking about Long Covid.

doom2 6 hours ago | parent [-]

Now you see channels avoiding saying "Gaza" or "genocide". I haven't seen proof that platforms are censoring content related to Israel, but I wouldn't be surprised.

perihelions 11 hours ago | parent | prev | next [-]

According to Google's censorship algorithm, Michael Osterholm's podcast (he's a famous epidemiologist and was, at the time, a member of President Biden's own gold-star COVID-19 advisory panel).

https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

delichon 10 hours ago | parent [-]

My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.

barbacoa 9 hours ago | parent [-]

Google went so far as to scan people's private google drives for copies of the documentary 'plandemic' and delete them.

potsandpans 9 hours ago | parent [-]

Can you please provide evidence? I'm not saying I don't believe you. It's just... extraordinary claims etc

barbacoa 9 hours ago | parent [-]

https://reclaimthenet.org/google-drive-takes-down-user-file-...

tbrownaw 3 hours ago | parent [-]

That sounds like the particular file in question was set to public and being widely shared around.

Which is rather different than scanning actual private files.

potsandpans 9 hours ago | parent | prev | next [-]

Saying lab leak was true

carlosjobim 11 hours ago | parent | prev [-]

Every opinion different from the opinion of "authorities". They documented it here:

https://blog.youtube/news-and-events/managing-harmful-vaccin...

From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

ggm 6 hours ago | parent | prev | next [-]

Without over-doing it, as a non-American not resident in the USA, it is so very tempting to say "a problem of your making", but in truth we all have a slice of this, because the tendency to conduct state policy via mis-truths in the media is all-pervasive.

So yes. This is a problem rooted in the USA. But it is still a problem, and it's a problem for everyone, everywhere, all the time.

st-keller an hour ago | parent | prev | next [-]

More speech! The signal-to-noise ratio shifts, so access to information will become more difficult. More disinformation and outright nonsense will make it harder to get to the valuable stuff. OK, let's see how that works!

petermcneeley 7 hours ago | parent | prev | next [-]

Arguing online about the merits of free speech is as paradoxical as having discussions about free will.

bromuro 5 hours ago | parent | prev | next [-]

YouTube is like old-school television at a different scale: they have to answer to politics and society. Our videos are their lineup.

frollogaston 5 hours ago | parent | prev | next [-]

Still curious if the White House made them pin those vaccine videos on the homepage, then disable dislikes.

flohofwoe an hour ago | parent | prev | next [-]

Even more misinformation, Russian propaganda and bots to sift through in the recommendations and comments, got it!

pessimizer 11 hours ago | parent | prev | next [-]

Better article: https://www.businessinsider.com/youtube-reinstate-channels-b...

Actual letter: https://judiciary.house.gov/sites/evo-subsites/republicans-j...

Good editorial: https://www.businessinsider.com/google-meta-congress-letter-...

topspin 10 hours ago | parent | next [-]

All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.

Yes, I know about the Charlie Kirk firings etc.

dang 9 hours ago | parent | prev | next [-]

Ok, we've changed the URL above to that first link from https://www.offthepress.com/youtube-will-let-users-booted-fo.... Thanks!

murphyslab 11 hours ago | parent | prev [-]

Two articles that I found offered a well-rounded analysis:

- https://www.engadget.com/big-tech/youtube-may-reinstate-chan...

- https://arstechnica.com/gadgets/2025/09/youtube-will-restore...

woeirua 10 hours ago | parent | prev | next [-]

It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

CobrastanJorji 9 hours ago | parent | next [-]

Yeah, there are two main things here that are being conflated.

First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

stronglikedan 9 hours ago | parent | prev | next [-]

The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever?

I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

asadotzler 8 hours ago | parent | prev | next [-]

Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

kypro 9 hours ago | parent | prev | next [-]

I've argued this before, but the algorithms are not the core problem here.

For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.

So while I 100% support changing algorithms to encourage more diversity of views, I also think that as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very small minority.

woeirua 9 hours ago | parent [-]

I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.

theossuary 9 hours ago | parent | prev | next [-]

The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.

hsbauauvhabzb 9 hours ago | parent | next [-]

Algorithms that reverse the damage by providing opposing opinions could be implemented.

amanaplanacanal 6 hours ago | parent [-]

Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.

squigz 8 hours ago | parent | prev [-]

I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.

int_19h 7 hours ago | parent | next [-]

If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...

theossuary 3 hours ago | parent | prev [-]

It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many watch out of habit or convenience, but won't follow when they move to a platform with less moderation.

terminalshort 9 hours ago | parent | prev [-]

The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.

woeirua 9 hours ago | parent | next [-]

Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject it will just show you more and more videos reinforcing that view point because you're likely to watch them!

terminalshort 2 hours ago | parent [-]

Why should Youtube try to tell me what it thinks I should want to watch instead of what I actually want to watch? I'm not particularly interested in their opinion on that matter.

TremendousJudge 9 hours ago | parent | prev [-]

"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?

terminalshort 2 hours ago | parent [-]

Given the number of people that describes, it's pretty clear that people do want that. It's not exactly a new and surprising thing that people want things that are bad for them.

rustystump 9 hours ago | parent | prev | next [-]

The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people doing nothing wrong will be caught up in it.

This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright situation is anything to go by, this is going to hurt more than help.

cavisne 5 hours ago | parent | prev | next [-]

They should bring back the content too. When history books are written the current state of things is misleading.

EasyMark 3 hours ago | parent | prev | next [-]

I'm not sure why they would, it's kind of a dumb move. They aren't violating anyone's freedom of speech by banning disinformation and lies. It's a public service, those people can head on over to one of the many outlets for that stuff. This is definitely a black mark on YouTube.

keeda 6 hours ago | parent | prev | next [-]

In other news (unrelated, I'm sure):

"DOJ aims to break up Google’s ad business as antitrust case resumes"

https://arstechnica.com/gadgets/2025/09/google-back-in-court...

lupusreal 8 hours ago | parent | prev | next [-]

Prediction, nobody will be unbanned because they'll all be found to have committed other bannable offenses. Youtube gives Trump a fake win while actually doing nothing.

pcdoodle 8 hours ago | parent | prev | next [-]

So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.

jameslk 4 hours ago | parent | prev | next [-]

Misinformation, disinformation, terrorism, cancel culture, think of the children, fake news, national security, support our troops, and on and on. These will be used to justify censorship. Those who support it today may find out it's used against them tomorrow.

serf 9 hours ago | parent | prev | next [-]

I'd like to think that if I were a YTer who got banned for saying something I believed in, I would at least have the dignity not to take my value back to the group that squelched me.

...but I'm not a YTer.

TeMPOraL 9 hours ago | parent [-]

It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.

saubeidl 7 hours ago | parent | prev | next [-]

The world is going backwards rapidly. The worst people are once again welcomed into our now-crumbling society.

dev1ycan 4 hours ago | parent | prev | next [-]

Social media and a lack of scientific-research literacy are eventually going to prove fatal for modern society. Take this Tylenol thing: on one side I have people who believe a study blindly, without reading that it doesn't take several important variables into consideration and that more studies are needed; on the other side I have people who didn't read the study at all, insisting it's impossible Tylenol could be causing anything because it's the only pain med pregnant women can take. A clear non-understanding of how controlled trials work.

Same thing with the UFO "Alien" video that was "shot down" by a Hellfire missile (most likely a balloon): people just automatically assume that because it was said in Congress it has to be true. Zero analysis of the footage whatsoever, no desire to seek analysis from an expert; nope, it must be an alien.

There is so much misinformation, so much lack of understanding, and so many people, from every side, who have a complete and utter lack of understanding of how seemingly basic things work. I am afraid for the future.

But yeah! Let's unban unscientific sources, oh, and people who are okay with a literal coup against a democracy.

TwoNineFive 6 hours ago | parent | prev | next [-]

They have a desperate need for false-victimhood.

Without their claim to victimization, they can't justify their hatred.

guelo 9 hours ago | parent | prev | next [-]

The amount of flagged hidden comments here by the supposed anti censorship side is almost funny.

dang 9 hours ago | parent [-]

If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.

On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.

(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)

https://news.ycombinator.com/newsguidelines.html

croes 7 hours ago | parent | next [-]

Flagging isn't the worst that can happen; you can also be rate limited, which prevents you from answering in a discussion because of "you are posting too fast".

I know what I'm talking about.

dang 6 hours ago | parent [-]

Yes, when accounts have a pattern of posting too many unsubstantive and/or flamewar comments, we sometimes rate limit them.

We're happy to take the rate limit off once the account has built up a track record of using HN as intended.

croes 2 hours ago | parent [-]

What exactly is meant by track record?

alex1138 8 hours ago | parent | prev [-]

Yeah, but in practice this isn't actually the case; people flag all the time just because someone has a dissenting opinion, fitting none of the categories you mentioned.

dang 8 hours ago | parent | next [-]

As mentioned, I haven't seen cases of that in the current thread. If there are any, I'd appreciate links. We don't see everything.

braiamp 8 hours ago | parent | prev [-]

There's one comment literally spreading misinformation, and it isn't flagged; instead it got pushback from others, critically pointing out the weaknesses of its arguments.

boxerab 7 hours ago | parent | prev | next [-]

tl;dr The Biden Administration has been caught using the government to force Twitter, YouTube and Facebook to censor its political enemies.

EasyMark 3 hours ago | parent [-]

They never forced them, and they certainly never said "that's a nice merger you got there, it would be awful if something were to happen to it" per the current policies of the US government.

terminalshort 2 hours ago | parent [-]

Yes they did https://www.npr.org/2021/07/22/1019346177/democrats-want-to-...

cbradford 9 hours ago | parent | prev | next [-]

So absolutely no one involved will have any repercussions. So they will all do it over again at the next opportunity

JumpCrisscross 9 hours ago | parent | next [-]

> they will all do it over again at the next opportunity

Future tense?

asadotzler 8 hours ago | parent | prev | next [-]

They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.

johnnyanmac 9 hours ago | parent | prev | next [-]

yeah, 2025 in a nutshell. The year of letting all the grifts thrive.

lazyeye 9 hours ago | parent | prev [-]

What should the punishment be for having opinions the govt disagrees with?

Supermancho 9 hours ago | parent | next [-]

Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.

The next Drain-o chug challenge "accident" is inevitable, at this rate.

lazyeye 3 hours ago | parent [-]

That sounds great in theory. In practice, "misinformation" ends up being defined as anything the govt finds inconvenient. Or it is selectively applied, so that when misinformation comes from all sides of the political spectrum, only people the govt doesn't like (in the more general sense) get kicked off platforms.

th0ma5 9 hours ago | parent | prev [-]

Notoriety

lazyeye 9 hours ago | parent [-]

Yep..and fame, admiration, contempt, loathing, indifference etc

jmyeet 8 hours ago | parent | prev | next [-]

First, let's dispense with the idea that anybody is a free speech absolutist. Nobody is. No site is. Not even 4chan is (i.e., CSAM is against 4chan's ToS and is policed).

Second, some ideas just aren't worth distributing or debating. There's a refrain "there's no point debating a Nazi". What that means is there is a lot of lore involved with being a modern Nazi, a labyrinth of conspiracy theories. To effectively debate a Nazi means learning all that lore so you can dismantle it. There's no point. In reality, all you end up doing is platforming those ideas.

I'm actually shocked at how ostensibly educated people fall into the anti-vax conspiracy trap. Covid definitely made this worse but it existed well before then. Certain schools in San Francisco had some of the lowest child vaccination rates in the country.

As a reminder, the whole vaccine-autism "theory" originated from one person: Andrew Wakefield. He was a doctor in the UK who was trying to sell a vaccine. The MMR vaccine was a direct competitor, so he just completely made up the MMR link to autism. He lost his medical license because of it. But of course he found a receptive audience in the US. He is and always was a complete charlatan.

Likewise, the Covid anti-vax movement was based on believing random YouTube videos from laymen and, in many cases, an intentional ignorance in the most esteemed traditions of American anti-intellectualism. People who are confidently wrong about provably wrong things and who had no interest in educating themselves. Some were grifters. Some were stupid. Many were both.

We had people who didn't understand what VAERS was (and is). We had more than 10 million people die of Covid, yet people considered the vaccine "dangerous" without any evidence of side effects, let alone death. As one example, you had people yelling "J'accuse!" at hints of myocardial inflammation from the vaccine. But you know what else causes myocardial inflammation? Getting Covid.

If you're excited by this move, it just further highlights that you have no idea what's going on and zero interest in the truth. What's happening here is big tech companies capitulating to the fringe political views of the administration, a clear First Amendment violation, to curry favor, get their mergers approved, get government contracts, and so on.

Regardless of your views on this or any other issue, you should care about capitulation by social media sites in this way.

The comments on this post are just a graveyard of sadness.

int_19h 7 hours ago | parent [-]

The problem with those "ideas that just aren't worth" distributing or debating is the usual one: who decides?

In my country of origin, you get called a Nazi simply for being opposed to the war of aggression that it is currently engaged in. In the US, we have a long history of "terrorist" and "extremist" being similarly abused.

jmyeet 7 hours ago | parent [-]

Do you think it's a good idea that this administration gets to decide what is and isn't acceptable speech? That's one of my points. So regardless of your positions on Covid and the 2020 election, you shouldn't celebrate this move, because the government shouldn't have this kind of influence.

int_19h 3 hours ago | parent [-]

Oh, absolutely, I don't think this move by Google has anything to do with them being some kind of staunch free speech supporters. It's an obvious and rather pathetic attempt to suck up to the Trump administration, which itself is cancer when it comes to rights and freedoms. I'm no COVID denialist either.

I just don't think that "there's no point debating a Nazi" is, in general, a good argument in favor of censorship, whether public or private. It's one of those things that have a good ring to it and make some superficial sense, like "fire in the crowded theater", and then you look at how it works in the real world...

alex1138 9 hours ago | parent | prev | next [-]

So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source

First of all, you can't separate a thing's content from the platform it's hosted on? Really?

Second of all, this is why

I'll just go do this again and if you flag me it's on you, you have no standing to do it (the internet is supposed to be democratic, remember?)

https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...

https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...

https://rumble.com/vt62y6-covid-19-a-second-opinion.html

https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...

I could go on. Feel free if you want to see more. :)

(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)

braiamp 8 hours ago | parent | next [-]

The reason you are asked for a better source is because, and let me say this slowly, anyone can post any crap on the internet without repercussions. Let's start with the one that references "Sasha Latypova". If I search her credentials, she earned a Master of Business Administration, which she used to co-found two companies, neither of them even adjacent to pharmacology, yet she is a "global PHARMA regulation expert". I'm sure the other people there won't have those issues, right?

1121redblackgo 8 hours ago | parent | prev [-]

Boo

moomoo11 11 hours ago | parent | prev | next [-]

I think hardware and IP-level bans... should be banned.

I know that some services do this in addition to an account ban.

ocdtrekkie 10 hours ago | parent [-]

Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.

jjk166 9 hours ago | parent [-]

If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.

Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.

JumpCrisscross 9 hours ago | parent | next [-]

> If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life

We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.

jjk166 8 hours ago | parent [-]

Well, work on making those resources available instead of, again, informing CSAM creators how to better hide their activities. I fail to see how repeatedly removing CSAM posted from a single IP address is more of a boon to CSAM distributors than playing whack-a-mole with multiple IP addresses. Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate, and in my opinion much more pressing, issue.

JumpCrisscross 8 hours ago | parent | next [-]

> informing CSAM creators how to better hide their activities

This adds to their risks and costs. That tips the economic balance at the margin. Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't political capital.

jjk166 7 hours ago | parent [-]

> This adds to their risks and costs. That tips the economic balance at the margin.

Charging would-be bank robbers a fee to do practice runs of breaking into a vault adds to their costs; somehow that doesn't seem like an effective security measure.

> Actually going after all creators would require an international law-enforcement effort for which, frankly, there isn't political capital.

I'm not talking about going after all creators, just the ones you have the identifying information for which are so continuously pumping out such quantities of CSAM that it is impossible to stop the firehose by removing the content.

If you don't have the political capital to go after them, again you have bigger issues to deal with.

JumpCrisscross 6 hours ago | parent [-]

> somehow that doesn't seem like an effective security measure

…this is literally how we police bank theft. Most bank thieves are never caught because they can do it online from an unresponsive jurisdiction.

> just the ones you have the identifying information for

Sure. You’re still going to have a firehose of CSAM, and worse, newly-incentivised producers, if you turn off moderation.

squigz 8 hours ago | parent | prev [-]

> Wasting law enforcement resources on other things while CSAM producers are free to operate is a separate

It's been a long time since I had anything remotely to do with this (thankfully) but... I'm pretty sure there are lots of resources devoted to this, including the major (and even small) platforms working with various authorities to catch these people? Certainly to say they're "free to operate" requires some convincing.

jjk166 7 hours ago | parent [-]

Pick a lane. Either we have the resources to go after CSAM producers, in which case we should be using them; or we don't, in which case we should be getting those resources. In either scenario, banning IPs is a counterproductive strategy to combat CSAM and it is a terrible justification for permitting IP bans.

JumpCrisscross 6 hours ago | parent | next [-]

> Either we have the resources to go after CSAM producers, in which case we should be using them; or we don't, in which case we should be getting those resources

We don’t have the resources and we don’t want to divert them.

> banning IPs is a counterproductive strategy to combat CSAM and it is a terrible justification for permitting IP bans

The simple reason for banning Russian and Chinese IPs is the same as the reason I block texts from Vietnam. I don’t have any legitimate business there and they keep spamming me.

squigz 5 hours ago | parent | prev [-]

I'm not the one you were arguing with initially, I just wanted to address the idea that child abusers are just free to do whatever they want, and we're not doing anything about it.

ocdtrekkie 9 hours ago | parent | prev [-]

Yes, we should let people "self-incriminate" with Tor and disposable email services...

jjk166 8 hours ago | parent [-]

We're talking about websites like Youtube implementing hardware and IP bans. If your argument is that these are easily circumventable by CSAM distributors, that seems like all the more reason not to use them to combat CSAM.

reop2whiskey 8 hours ago | parent | prev | next [-]

Is there any political censorship scheme at this large a scale in modern US history?

rimbo789 8 hours ago | parent [-]

Yes: the way the US government, big business, and the media (specifically Hollywood) colluded during the Cold War.

ironman1478 8 hours ago | parent | prev [-]

There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist, though, because you're allowed to sue for libel and slander. We know those are harmful, because people will believe lies about a person, damaging their reputation. It's not clear why this can't be generalized to things we have a high confidence of truth in and where lying is actively harmful.

asadotzler 8 hours ago | parent | next [-]

No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.

ironman1478 8 hours ago | parent [-]

In the article it mentions that Google felt pressured by the government to take the content down. Implying that they wouldn't have if it wasn't for the government. I wasn't accusing Google of anything, but rather the government.

Maybe it's not banning, but it doesn't feel right. Google shouldn't have been forced to do that, and really what should've happened is that the people who spread genuinely harmful disinformation, like the bleach-injecting stuff, the ivermectin stuff, or the anti-vax stuff, should've faced legal punishment.

alex1138 8 hours ago | parent | prev | next [-]

Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong; all those we banned off the platforms (the actual experts) were right.

reop2whiskey 8 hours ago | parent | prev [-]

What if the government is the source of misinformation?

ironman1478 8 hours ago | parent | next [-]

It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.

I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.

We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.

EasyMark 3 hours ago | parent | prev [-]

It certainly happens; we're currently flooded with it from the current regime:

- Tylenol causes autism

- Vaccines cause autism

- Vaccines explode kids' hearts

- Climate change is a hoax by Big Green

- "Windmill Farms" are more dangerous for the environment than coal

- I could go on but I won't