| ▲ | theossuary 12 hours ago |
| The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that. |
|
| ▲ | hsbauauvhabzb 11 hours ago | parent | next [-] |
| Algorithms that reverse the damage by providing opposing opinions could be implemented. |

| ▲ | amanaplanacanal 8 hours ago | parent [-] |
Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.
|
|
| ▲ | squigz 11 hours ago | parent | prev [-] |
| I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere. |

| ▲ | int_19h 10 hours ago | parent | next [-] |
If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...

| ▲ | theossuary 6 hours ago | parent | prev [-] |
It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many watch out of habit or convenience, but won't follow when the influencer moves to a platform with less moderation.
|