the_af 5 days ago
I wonder about this. I'm not discounting your experience, but my YouTube recommendation page is great. I only see my subscriptions, or things directly related to things I've watched and liked. If I remove a disliked video from my watch history, it "mostly" works to tell YouTube I don't want to see it anymore. I very seldom see crap I really do not want in my YouTube feed/recommendations. All I see are hobby videos and cartoon clips of things I like. This is totally unlike Facebook (where random garbage recommendations are the norm) or Reddit (which is hit or miss).
SoftTalker 5 days ago
My recommendations are generally aligned with my interests as derived from my view history, likes, and subscriptions. But more and more of it is AI-generated, or videos copied from the original creator and reposted by someone else. I try to use "don't show me videos from this channel" on those, but more and more just appears. I think there must be bots creating new channels and copying/generating content faster than I can block them. And please, let me opt out of Shorts permanently. I keep telling them I don't want Shorts, but they always come back. I pay for a Premium account, so they should respect my wishes on this.
andrewflnr 5 days ago
Same. My recommended feed is relatively ok, but I'm fairly ruthless with the "I don't want this" and "Don't recommend this channel" buttons. Meanwhile I've been off Facebook for years in large part because their feed appeared to be unsalvageable.
vorpalhex 5 days ago
I did an experiment where I really invested in my YouTube suggestions. You can definitely groom your recommendations, and then they can be pretty good. But then you hit the issue where you get into a new hobby or a new interest, watch some videos related to it, and your recommendations spiral back out of control. So you can do a whole bunch of grooming work, but they probably just go back to being like 80% wrong. I got vaguely interested in the piano, and now 80% of my recommendations are music related, but not actually things I care about; they've just gone back to being total trash.
PaulHoule 5 days ago
On the computer attached to my stereo, YouTube shows me almost 100% conservative, boring, safe but good music recommendations -- all things I've liked before; it rarely tries to show me anything new or challenging. On another browser it shows me mostly videos about stereo equipment. On yet another it shows me a mix of videos aimed at someone who listens to The Ezra Klein Show. That browser and the previous one sometimes get a burst of videos about "How Brand X has lost its way" or "Why Y sucks today".

One time on Shorts I clicked on a video where an A.I.-generated woman transforms into a fox on America's Got Talent, and after that it wanted to show me hundreds of A.I. slop videos of Chinese girls transforming into just about anything on the same show, with the same music and the same reaction shots. If you click on a few Wheat Waffles videos, you might quickly find your feed is nothing but blackpill incel videos, plus videos that apply a blackpill philosophy to life, such that not only is dating futile but everything else is futile too.

The conclusion I draw from this is that you can't easily draw conclusions about the experience other people have with recommenders. That's one reason political ads on social media are so problematic: you can tell baldfaced lies to people who are inclined to believe them, and skeptical people will never see them or hold anyone to account.