YouTube’s algorithm doesn’t care if you “thumbs down” videos



Screenshot of a YouTube video with the cursor hovering over the dislike button.

YouTube already stopped displaying the number of dislikes a video has received, and apparently giving a video a thumbs-down does little to change how many similar videos the platform recommends.
Picture: Chiwi (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s “Kitchen Nightmares.” It may be partly my own fault: I got drunk one evening and watched an entire episode. But let me tell you, if there’s one thing I don’t want on my feed anymore, it’s the famously hardcore Brit tearing into yet another chef while some of the world’s most obnoxious sound effects (braaa-reeeee) play in the background. I disliked many of these videos, but now Hell’s Kitchen is all over my homepage, and I increasingly feel like one of the “raw” steaks Ramsay pokes and berates.

But apparently I’m not alone in my YouTube recommendation troubles. A Mozilla Foundation report released on Monday claims, based on a survey and crowdsourced data, that YouTube’s “Dislike” and “Don’t recommend channel” controls don’t actually do much to change video recommendations.

Well, there are two points here. One is that users consistently feel that the controls Google provides don’t actually make a difference. The other is that, based on the data collected from users, the controls have an “insignificant” effect on recommendations, meaning that “most unwanted videos still slither through.”

The Foundation relied on data from its own RegretsReporter, a browser extension that lets users block selected YouTube videos from appearing in their feed. The report says it based its analysis on 2,757 survey respondents and 22,722 people who gave Mozilla access to more than 567 million video recommendations collected between the end of 2021 and June 2022.

Although the researchers admit that the survey respondents are not a representative sample of YouTube’s huge and varied audience, a third of those polled said that using YouTube’s controls did not change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam, only to see them return to their feed later. Respondents often said that blocking one channel would just lead to recommendations from similar channels.

YouTube’s algorithm routinely pushes users toward videos they don’t want to watch, and it’s often worse than just old Ramsay cable reruns. A 2021 Mozilla report, also based on crowdsourced user data, found that viewers of the video platform are regularly recommended content containing violence, hate speech, and political disinformation.

In the recent report, Mozilla researchers found recommendation pairs in which a video a user rejected, such as a Tucker Carlson screed, was followed by a recommendation for yet another video from Fox News’ YouTube channel. Based on a review of 40,000 video pairs, when one channel is blocked, the algorithm often simply recommends very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at cleaning up users’ feeds, but still only prevented 43% and 29% of unwanted recommendations, respectively.

“In our data analysis, we found YouTube’s user controls to be insufficient as tools to prevent unwanted recommendations,” Mozilla researchers wrote in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an email that “our controls are not filtering out entire topics or viewpoints, as this could have negative effects on viewers, such as creating echo chambers.” The company says it doesn’t prevent all content related to a topic from being recommended, but it also says it promotes “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation rules.

In a 2021 blog post, Cristos Goodrow, YouTube’s vice president of engineering, wrote that the company’s system is “constantly evolving,” but that making the algorithm transparent is “not as simple as providing a recommendation formula,” because the systems take into account clicks, watch time, survey responses, shares, likes, and dislikes.

Of course, like any social media platform, YouTube has struggled to create systems that can combat the full spectrum of bad, even predatory, content uploaded to the site. An upcoming book shared exclusively with Gizmodo claimed that YouTube came close to putting billions of dollars in ad revenue on the line in order to deal with the weird and disturbing videos being recommended to kids.

While Hernandez claimed the company has expanded its API data, the spokesperson added: “Mozilla’s report does not take into account how our systems actually work, so it is difficult for us to gather many insights.”

But that’s a criticism Mozilla also lays at Google’s feet, saying the company doesn’t provide enough access for researchers to evaluate what influences YouTube’s secret sauce: its algorithms.
