woeirua 12 hours ago

It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

CobrastanJorji 11 hours ago | parent | next [-]

Yeah, there are two main things here that are being conflated.

First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

stronglikedan 12 hours ago | parent | prev | next [-]

The problem is that misinformation has now become information, and vice versa. So who was anyone to decide what was misinformation back then, or now, or ever?

I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

3cKU 11 hours ago | parent [-]

[dead]

asadotzler 10 hours ago | parent | prev | next [-]

Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

kypro 12 hours ago | parent | prev | next [-]

I've argued this before, but the algorithms are not the core problem here.

For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.

So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very small minority.

woeirua 11 hours ago | parent [-]

I might agree that the algos making recommendations in the sidebar might not matter much, but the algos that control which videos show up when you search on Google, and in YouTube search, absolutely do matter.

theossuary 12 hours ago | parent | prev | next [-]

The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.

hsbauauvhabzb 11 hours ago | parent | next [-]

Algorithms that reverse the damage by providing opposing opinions could be implemented.

amanaplanacanal 8 hours ago | parent [-]

Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.

hsbauauvhabzb 35 minutes ago | parent [-]

I agree. My point was that it is possible. Google would never do it without being forced.

squigz 11 hours ago | parent | prev [-]

I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.

int_19h 9 hours ago | parent | next [-]

If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...

theossuary 6 hours ago | parent | prev [-]

It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many watch out of habit or convenience, but won't follow when they move to a platform with less moderation.

terminalshort 12 hours ago | parent | prev [-]

The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.

woeirua 11 hours ago | parent | next [-]

Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!

terminalshort 5 hours ago | parent [-]

Why should Youtube try to tell me what it thinks I should want to watch instead of what I actually want to watch? I'm not particularly interested in their opinion on that matter.

TremendousJudge 12 hours ago | parent | prev [-]

"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?

terminalshort 5 hours ago | parent [-]

Given the number of people that describes, it's pretty clear that people do want that. It's not exactly new or surprising that people want things that are bad for them.