Non-profit tech outfit Mozilla has lobbed a hand grenade at Google-owned video platform YouTube in the form of a 47-page report that damns the negative feedback tools (the Dislike button among them) users are given to control their viewing as largely ineffective, citing thousands of analyzed user experiences. YouTube rejects the criticism, saying the methodology doesn't take into account how its controls actually work.

In its report, Mozilla said it collected the experiences of 22,722 people through a browser extension it created called RegretsReporter. The extension overlays a visible "Stop Recommending" button on video thumbnails on the site. For participants in the consenting research group, however, the signal the extension actually sent to YouTube varied between the different forms of feedback the site currently offers: nothing (as a control), Dislike, Not interested, Don't recommend channel, and Remove from history.

The most effective of the four real actions was the Don't recommend channel option, which prevented 43% of unwanted recommendations. Removing videos from watch history had a 29% success rate, while Dislike and Not interested trailed at 12% and 11%, respectively. For reference, the control group saw a bad-recommendation rate of around 40%; subtracting same-channel video recommendations, that number drops closer to 30%. Across all feedback modes, the overall rate of bad recommendations decreased over time by at least 10 percentage points.

In a pre-trial survey of 2,758 people, Mozilla said respondents were generally doubtful that their actions had any effect at all on the algorithm that drives their video suggestions.

The organization concluded that YouTube should be clearer upfront about how its feedback tools alter users' experiences on the site, and should provide controls that carry substantial weight with the recommendation algorithm.

In a statement to The Verge, YouTube spokesperson Elena Hernandez says Mozilla's report relies on definitions that don't reflect what its feedback tools actually do: the "Not interested" option only removes a specific video from future recommendations, while "Don't recommend channel" stops recommending videos from that channel altogether.

Moreover, the company says its controls deliberately do not filter out entire topics or viewpoints, as doing so could push users into echo chambers.

Hernandez also mentions that the company has expanded researcher access to its Data API through the YouTube Researcher Program.

Mozilla countered that its research surfaced metrics the API wouldn't have made available, and it stands by its position that user feedback generally takes a back seat when YouTube embarks on a policy change.

When a system as opaque as YouTube's algorithm-driven video recommendations is in play, Mozilla argues, people will use whatever controls they are given to try to rein it in.