Did you fall down a YouTube rabbit hole? Good luck getting out – study shows ‘dislike’ does NOT work

If you’ve ever searched for something innocent on YouTube but ended up down a rabbit hole of extreme or distasteful content, then you’re familiar with the frustrations of the platform’s algorithms.

A new report from Mozilla, the nonprofit organization behind the Firefox browser, shows that YouTube’s in-app controls – including the ‘dislike’ button and the ‘not interested’ feature – are ineffective.

Researchers used data collected through RegretsReporter, Mozilla's browser extension that allows people to 'donate' their recommendation data for use in studies.

The report was based on over 567 million videos from a total of 22,722 users and covered a time period from December 2021 to June 2022.

Of the four main controls Mozilla tested, only 'do not recommend from channel' was effective – it prevented 43 percent of unwanted recommendations. The 'dislike' button and 'not interested' feature, however, were hardly helpful, preventing only 12 percent and 11 percent of unwanted suggestions, respectively.

A number of participants who volunteered to share their opinions in a survey with Mozilla told the nonprofit that they often went to great lengths to avoid unwanted content that YouTube’s algorithms kept showing them.

At least 78.3 percent of survey participants said they used YouTube’s existing feedback tools and/or changed the platform’s settings. More than a third of participants said that using YouTube’s controls did not change their recommendations at all.

'Nothing changed,' said one survey participant. 'Sometimes I reported things as misleading and spam and the next day it was back. It almost feels like the more negative feedback I give on their suggestions, the higher the bulls**t mountain gets. Even when you block certain sources, they eventually come back.'

Another participant said the algorithm changed in response to their actions, but not in a good way.

‘Yes, they changed, but in a bad way. In a way, I feel penalized for proactively trying to change the behavior of the algorithm. In some ways, less interaction provides less data to base recommendations on.’

Mozilla also found that some users were being shown graphic content, firearms or hate speech, in violation of YouTube’s own content policies, despite submitting negative feedback using the company’s tools.

The researchers determined that YouTube’s user controls left viewers feeling confused, frustrated and out of control of their experience on the popular platform.

'People feel that using YouTube's user controls doesn't change their recommendations at all. We learned that many people take a trial-and-error approach to changing their recommendations, with limited success,' the report states.

‘YouTube’s user control mechanisms are insufficient to prevent unwanted recommendations. We found that YouTube’s user controls influence what is recommended, but this effect is negligible and most unwanted videos still slip through.’

DailyMail.com contacted YouTube for comment and will update this story as necessary.

Mozilla recommends a number of changes to how the platform's user controls work to make them better for users.

For example, the tools should use plain language that states exactly what they do – so instead of 'I don't like this recommendation', a button should say 'Block future recommendations on this topic'.

'YouTube should make major changes to how people can shape and control their recommendations on the platform,' the nonprofit states in the report's conclusion. 'YouTube should respect the feedback users share about their experience and treat it as meaningful signals about how people want to spend their time on the platform.'

'YouTube should overhaul its ineffective user controls and replace them with a system where people's satisfaction and well-being are treated as the most important signals.'
