Study suggests YouTube’s dislike button isn’t doing what you want it to do

A new study from Firefox developer Mozilla suggests that YouTube’s video moderation tools are largely ineffective, with the site continuing to recommend videos users aren’t interested in.

The way it’s supposed to work is that users have several tools to teach YouTube’s enigmatic algorithm what they don’t want to watch: the Dislike button, the Don’t Recommend Channel option, and the ability to remove videos from your account’s history. But according to Mozilla’s study, users still get these “bad recommendations.” At best, YouTube’s tools cut down unwanted videos by almost half. At worst, YouTube does the opposite: it increases the number of unwanted videos you’ll see.

The full 47-page study is available on Mozilla’s website, where it breaks down the researchers’ methodology, how the organization obtained the data, its findings, and what it recommends YouTube should do.

Mozilla’s findings

The study drew on more than 22,000 volunteers who installed Mozilla’s RegretsReporter browser extension, which lets users control recommendations on YouTube and submit reports to the researchers. Through RegretsReporter, the researchers analyzed well over 500 million videos.

According to the findings, YouTube’s tools are wildly inconsistent. Some 39.3 percent of participants saw no change to their recommendations at all. One user, identified as Participant 112 in the study, used the moderation tools to stop getting medical videos on their account, only to be inundated with them again a month later. Another 23 percent reported a mixed experience: the unwanted videos disappeared for a while, then reappeared soon after. Only 27.6 percent of participants said the bad recommendations stopped after they used the moderation tools.

The most effective standalone tool turned out to be the Don’t Recommend Channel option, which cut unwanted recommendations by around 43 percent. The Not Interested option and the Dislike button fared the worst, stopping only 11 percent and 12 percent of unwanted videos, respectively.

The researchers also found that people change their own behavior to manage recommendations. Participants said they would adjust YouTube settings, switch to a different account, or outright avoid watching certain videos lest they get more of them. Others turned to VPNs and privacy extensions to keep their recommendations clean.

The study closes with Mozilla’s own recommendations for how YouTube should change its algorithm, with most of the emphasis on transparency. The researchers want the controls made easier to understand, ask YouTube to act on user feedback more often, and call on the platform to be clearer about how its algorithm actually works.

YouTube’s response

In response, a YouTube spokesperson gave a statement to The Verge criticizing the study, claiming the researchers didn’t take into account how its “systems actually work” and misunderstood how the tools function: the moderation controls filter out a particular video or channel, not an entire topic. By the researchers’ own admission, the study is “not a representative sample of YouTube’s user base,” but it does give some insight into user frustration.

That said, YouTube’s algorithm and the changes surrounding it have drawn considerable ire from users. Many were unhappy when YouTube removed the public dislike count from the site, to the point where people created browser extensions just to add it back. There are also claims that YouTube leans on controversial content to boost engagement. If Mozilla’s data is correct, unwanted recommendations may be a byproduct of the platform capitalizing on content people don’t want in order to rack up more views.

If you’re interested in learning more about YouTube, be sure to check out TechRadar’s story on malware being spread through gaming videos.