A new study from Mozilla, the developer of Firefox, suggests that YouTube’s video moderation tools are largely ineffective: the site continues to recommend videos users have indicated they’re not interested in.
In theory, these tools let users teach YouTube’s opaque algorithm what they don’t want to see: the Dislike button, the Don’t Recommend Channel option, and the ability to remove videos from your watch history. But according to Mozilla’s survey, users still receive these “bad recommendations”. At best, YouTube’s tools cut unwanted videos by almost half. At worst, they do the opposite, increasing the number of unwanted videos you’ll see.
The entire 47-page study can be found on Mozilla’s website, where it breaks down the researchers’ methodology, how the organization obtained the data, its findings, and what it recommends YouTube should do.
The study drew on more than 22,000 volunteers who downloaded Mozilla’s RegretsReporter browser extension, which lets users manage recommendations on YouTube and send reports to researchers. Via RegretsReporter, the researchers analyzed well over 500 million videos.
According to the results, the effectiveness of YouTube’s tools is wildly inconsistent. 39.3 percent of participants saw no change in their recommendations. One user, identified as Study Participant 112, used the moderation tools to stop getting medical videos on their account, only to be flooded with them a month later. 23 percent reported a mixed experience: the unwanted videos disappeared for a while, then reappeared shortly after. And 27.6 percent of participants said the bad recommendations stopped after they used the moderation tools.
The most effective standalone tool turned out to be Don’t Recommend Channel, which reduced unwanted recommendations by about 43 percent. The Not Interested option and the Dislike button fared the worst, stopping only 11 percent and 12 percent of unwanted videos, respectively.
Researchers also found that people change their own behavior to manage recommendations. Survey respondents said they would change YouTube settings, use a different account, or even avoid watching certain videos so as not to be served more of them. Others used VPNs and privacy extensions to help keep their feeds clean.
At the end of the study, Mozilla’s researchers offer their own recommendations on how YouTube should change its algorithm, with most of the emphasis on transparency. They’d like the controls to be made easier to understand, and they ask YouTube to listen to user feedback more often and to be more open about how its recommendation algorithm actually works.
In response, a YouTube spokesperson gave a statement to The Verge criticizing the study. The spokesperson claims the researchers didn’t take into account how “the systems actually work” and misunderstood the tools: they don’t block an entire topic, only that particular video or channel. By the researchers’ own admission, the survey is “not a representative sample of YouTube’s user base,” but it does provide some insight into user frustration.
That said, YouTube’s algorithm and the changes surrounding it have caused considerable anger among users. Many were unhappy when YouTube removed the public Dislike count, to the point where people created browser extensions just to add it back. There are also claims that YouTube exploits controversial content to increase engagement. If Mozilla’s data is correct, unwanted recommendations could be a byproduct of the platform leveraging content people don’t want in order to get more views.
If you are interested in learning more about YouTube, be sure to check out TechRadar’s story on malware spreading through gaming videos.