Report finds that selecting “dislike” or “not interested” on YouTube doesn’t significantly alter users’ recommended videos
YouTube’s recommendation system frustrates and baffles users. Mozilla found that users’ YouTube recommendations don’t change much whether or not they use the “dislike” and “not interested” controls.
The study found that YouTube kept recommending videos similar to the ones users had rejected, regardless of which feedback tools or settings they used. “Not interested” and “dislike” prevented only 11% and 12% of bad recommendations, respectively, while “don’t recommend channel” and “remove from watch history” prevented 43% and 29%. Overall, participants were unhappy with YouTube’s ability to filter unwanted suggestions out of their feeds.
Mozilla’s researchers analyzed more than 567 million video recommendations using data from 22,722 users of its RegretsReporter browser extension. They also surveyed 2,757 RegretsReporter users for qualitative input.
Among those surveyed, 78.3% said they used YouTube’s feedback buttons, changed settings, or avoided certain videos to “train” the algorithm to recommend better content. Of those who tried to manage YouTube’s recommendations, 39.3% said their efforts made no difference.
Another 23% of those who tried to adjust YouTube’s recommendations were dissatisfied with the results, noting that unwanted videos eventually crept back into their feeds and that reshaping the suggestions took a great deal of time.
Mozilla concluded that even YouTube’s most effective controls do little to keep unwanted videos out of users’ feeds, saying the platform isn’t really interested in hearing what its users want and prefers opaque methods that drive engagement regardless of users’ best interests.
The group recommended that YouTube build user controls that are easy to understand and give researchers access to detailed data so they can better study its recommendation system.
In a separate study last year, Mozilla found that 71% of the videos participants reported “regretting” watching, including misinformation and spam, had been recommended by YouTube’s own algorithm. Months after that report was released, YouTube defended how it built its current recommendation system and said it works to filter out “low-quality” videos.
TikTok, Twitter, and Instagram are all moving to give people greater control over their feeds after years of relying on algorithmic curation.
Lawmakers around the world are also examining how opaque social media recommendation systems can harm users. The EU approved the Digital Services Act in April to improve platforms’ algorithmic accountability, while US lawmakers are pursuing the bipartisan Filter Bubble Transparency Act.
Reference: https://techcrunch.com/2022/09/20/youtubes-dislike-and-not-interested-options-dont-do-much-for-your-recommendations-study-says/