
Shocking research from Mozilla on the YouTube algorithm: Are things working in reverse?

A study conducted by Mozilla, the maker of Firefox, reveals something unexpected about the YouTube algorithm.

A new study by Firefox developer Mozilla suggests that YouTube’s video moderation tools are largely ineffective. In other words, hitting “Dislike” on a video doesn’t do as much as you might expect.

YouTube’s “mysterious” algorithm offers several tools for telling it what you don’t want to watch. You have options such as the Dislike button, the “Don’t recommend channel” option, and removing videos from your watch history. But according to Mozilla’s research, users still receive these “bad recommendations”. At best, YouTube’s tools cut unwanted videos by a little less than half. At worst, they do the opposite, increasing the number of unwanted videos you see.

The entire 47-page study is published on Mozilla’s website and includes the researchers’ methodology, how the organization obtained the data, the findings, and recommendations for what YouTube should do.

The study involved more than 22,000 volunteers who installed Mozilla’s RegretsReporter browser extension, which lets users flag unwanted recommendations on YouTube and send reports to the researchers. In total, the researchers analyzed more than 500 million videos through RegretsReporter.

According to the findings, YouTube’s tools are anything but consistent. 39.3 percent of respondents said they saw no change in their recommendations at all. One user in the study, identified as Participant 112, used the moderation tools to stop medical videos from appearing on their account, but a month later their feed was flooded with them. Another 23 percent reported a mixed experience: the unwanted videos disappeared for a while, only to reappear soon after. And 27.6 percent of respondents said the bad recommendations stopped after they used the moderation tools.

The most effective standalone tool appears to be “Don’t recommend channel”, which cuts unwanted recommendations by around 43 percent. The “Not interested” option and the Dislike button perform worst, stopping only 11 percent and 12 percent of unwanted videos, respectively.

The researchers also say that people change their own behavior to manage their recommendations. According to the study, users change their YouTube settings, switch to a different account, or avoid watching certain videos altogether so they won’t be recommended more of the same. Others use VPNs and privacy extensions to keep their feed clean.

At the end of the study, the Mozilla researchers offer their own recommendations for how YouTube should change its algorithm, with a strong emphasis on transparency. They want the controls made easier to understand, ask YouTube to listen to user feedback more often, and call on the platform to be more open about how its algorithm actually works.

YouTube’s response

In response, a YouTube spokesperson issued a statement to The Verge criticizing the study. The spokesperson claimed that the researchers did not take into account “how the systems actually work” and misunderstood what the tools are meant to do: they don’t block an entire topic, only a specific video or channel. As the researchers themselves concede, the study is “not a representative sample of YouTube’s user base”, but it does offer some insight into user frustration.

Still, the YouTube algorithm and the changes surrounding it continue to draw heavy backlash from users. Many people are unhappy that YouTube removed the public dislike count from the site, and there are also allegations that the platform surfaces controversial content to boost engagement.

Assuming Mozilla’s data is accurate, the unwanted recommendations could be a byproduct of the platform pushing content people don’t want to see in order to rack up more views…
