YouTube’s dislike button has little impact on the videos that the platform recommends to users, according to a study released Tuesday.
Researchers from the Mozilla Foundation, an international nonprofit that works to make the internet as accessible as possible, conducted an independent audit of YouTube’s user controls based on data provided by more than 20,000 users. The audit concluded that people generally do not feel YouTube’s user controls change their recommendations, and that this perception is largely correct: the mechanisms are “inadequate” at preventing unwanted recommendations.
Researchers found that user controls do influence what is recommended, but the effect is “negligible,” and most unwanted videos still get through. Most users resort to a “trial-and-error approach” to control their recommendations, using tactics such as rewatching only desired videos or clearing their browser history, but they reported limited success.
Researchers found examples of user controls failing outright, such as one case in which a user asked to stop seeing videos related to firearms but was recommended gun-related content soon after. Another user asked to stop seeing videos about cryptocurrency but continued to receive those recommendations.
Mozilla received more than 500 million video recommendations from the participants from December 2021 to June of this year.
The report made four proposals to YouTube and policymakers.
The first recommendation is that the platform’s user controls should be easy to understand and access, giving users clear information about the steps they can take to influence the recommendations they receive.
It also says YouTube should give more weight to user feedback in its recommendations and enhance its data access tools to allow researchers to analyze what influences the platform’s algorithm.
Researchers also called on lawmakers to pass or clarify laws that establish legal protections for public interest research.
YouTube spokeswoman Elena Hernandez said in a statement that the platform offers viewers control over their recommendations, including the ability to block a video or channel from being recommended. She said the controls do not filter out entire topics or viewpoints since this could create negative effects like echo chambers.
She said YouTube welcomes academic research on its platform, which is why it recently expanded data access through its YouTube Researcher Program.
"Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights," Hernandez said.
Hernandez said YouTube is continually looking for new ways to increase transparency and collaborate with academic researchers. She said Mozilla's definition of "similar" videos fails to account for the fact that viewers can ask to avoid specific videos or channels, but not entire topics or speakers.
She said the Mozilla researchers acknowledged that their sample is not representative of YouTube's broader user base, and that YouTube's own surveys show users are generally satisfied with the recommendations they receive.