YouTube’s algorithm recommends videos that violate the company’s own policies on inappropriate content, according to a new study.
The Mozilla Foundation has released new data showing that YouTube continues to recommend videos that violate its own content policies.
According to the Mozilla study, 3,362 videos were flagged as “regrettable” between July 2020 and May 2021, and 71 percent of the videos users deemed inappropriate had been surfaced by YouTube’s recommendation algorithm.
Researchers watched the reported videos and compared them against YouTube’s content guidelines. They found that 12.2 percent of the reported videos could no longer be found on YouTube.
About a fifth of the reported videos fall under what YouTube’s rules classify as misinformation, and another 12 percent spread COVID-19 misinformation, the researchers say. Other problems flagged in the study include violent or explicit content and hate speech.
“Some of our findings, if extrapolated to YouTube’s entire user base, would raise significant questions and be really worrying,” says Brandi Geurkink of Mozilla in Germany. “What we found is the tip of the iceberg.”
The findings are valuable because YouTube’s algorithm is essentially a black box. Every user sees something different in their feed based on their previous activity, so it is nearly impossible to quantify whether YouTube is fulfilling its stated goal of removing harmful content.
“This highlights the need to adapt moderation decisions by country and ensure that YouTube has expert moderators who know what’s going on in each country,” said Savvas Zannettou of the Max Planck Institute for Informatics in Germany.
YouTube has responded over the years with tweaks intended to suppress “borderline” videos — those that walk the line between what is acceptable and what violates its terms of service.
YouTube has previously stated publicly that policing its platform is extremely difficult given the enormous volume of video uploaded every day.