Did you fall down a YouTube rabbit hole? Good Luck Getting Out: Study Shows ‘I Don’t Like It’ Doesn’t Work

If you’ve ever searched for something innocuous on YouTube but ended up down a rabbit hole of extreme or unpleasant content, then you’re familiar with the frustrations of the platform’s algorithms.

A new report from Mozilla, the nonprofit organization behind the Firefox browser, shows that controls in the YouTube app, including the “Dislike” button and the “Not interested” feature, are ineffective.

The researchers used data collected through RegretsReporter, the organization’s browser extension, which lets people “donate” their recommendation data for use in studies.

The report was based on more than 567 million videos from 22,722 users in total and covered a time period from December 2021 to June 2022.

Of the four main controls Mozilla tested, only “Don’t recommend channel” was effective, preventing 43 percent of unwanted recommendations. The “Dislike” button and the “Not interested” feature were barely helpful, preventing only 12 percent and 11 percent of unwanted recommendations, respectively.

Several participants who volunteered to share their opinions in a survey with Mozilla told the nonprofit that they often went to great lengths to avoid unwanted content that YouTube’s algorithms kept showing them.

At least 78.3 percent of survey participants said they used YouTube’s existing feedback tools and/or changed the platform’s settings. More than a third of the participants said that using YouTube’s controls did not change their recommendations at all.

“Nothing changed,” said one survey participant. “Sometimes I would report things as misleading or spam, and the next day they would show up again. It almost feels as if the more negative feedback I give on its suggestions, the higher the mountain gets. Even when you block certain sources, they eventually come back.”

Another participant said that the algorithm changed in response to their actions, but not in a good way.

“Yes, they changed, but in a bad way. In a way, I feel punished for proactively trying to change the algorithm’s behavior. In some ways, less interaction provides less data on which to base recommendations.”

Mozilla also found that some users were shown graphic content, firearms, or hate speech, in violation of YouTube’s own content policies, even after they had submitted negative feedback using the company’s tools.

The researchers concluded that YouTube’s user controls leave viewers feeling confused, frustrated, and not in control of their experience on the popular platform.

“People feel that using YouTube’s user controls does not change their recommendations at all. We learned that many people take a trial-and-error approach to controlling their recommendations, with limited success,” the report states.

“YouTube’s user control mechanisms are inadequate for preventing unwanted recommendations. We found that YouTube’s user controls influence what is recommended, but this effect is marginal and most unwanted videos still slip through.”

DailyMail.com has contacted YouTube for comment and will update this story as necessary.

Mozilla recommends a number of changes to how the platform’s user controls work to improve them for users.

For example, tools should use plain language about exactly what action is being taken, so instead of “I don’t like this recommendation,” it should say “Block future recommendations on this topic.”

“YouTube should make major changes to the way people can shape and control their recommendations on the platform. YouTube should respect the feedback users share about their experience, treating it as a meaningful signal about how people want to spend their time on the platform,” the nonprofit concludes in the report.

“YouTube should review its ineffective user controls and replace them with a system where people’s satisfaction and well-being are treated as the most important signals.”
