YouTube Enhances Recommendation Feature: Possibility of Misinformation and Filter Bubbles

By Shreyashka Vikram Raj Maharjan on Dec 07, 2017 - 20:33

Google’s YouTube updated its recommendation feature to spotlight videos sequenced according to the user’s watch history and the videos the user finds most gratifying, while downplaying other content. This has raised flags that it might trap people in a bubble of misinformation and like-minded opinions, cutting out counter-opinion videos and content because they contradict the user’s values and thought process.

But Jim MacFadden and Cristos Goodrow, who work on the recommendation technology, believe, “The goal is to prevent the negative sentiments that can arise when people watch hours and hours of uninspired programs.”

But the question arises: what if the “inspiring” content is misinformation, misleading material, or propaganda?


We saw how in 2016 Russia exploited the recommendation features of top social sites such as Facebook Inc, Twitter Inc, and YouTube to spread propaganda and fake news during the US presidential election. The companies responded with tighter user verification and fact-checking technology.

“The risk is not that we are just siloing ourselves, but we’re able to also reinforce pre-existing, flawed viewpoints,” said Jacob Groshek, a Boston University associate professor who researches the influence of social media and “filter bubbles.”


YouTube recommends videos automatically through a machine learning algorithm that analyses the characteristics of the videos watched by its 1.5 billion users and generates personalised viewing recommendations.
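The feedback loop critics describe can be illustrated with a toy sketch. This is purely a hypothetical, minimal model, not YouTube's actual system (which is a large-scale neural network); all video names and tags below are invented for illustration.

```python
# Hypothetical sketch: rank candidate videos by similarity to watch history.
# Videos that resemble what was already watched always score highest,
# which is the "bubble" dynamic the article describes.

from collections import Counter

def recommend(watch_history, candidates, k=2):
    """Return the k candidates whose tags best match the watch history."""
    # Count how often each tag appears across the watched videos.
    tag_counts = Counter(tag for tags in watch_history.values() for tag in tags)

    def score(video):
        # A candidate's score is the sum of its tags' frequencies in history.
        return sum(tag_counts[tag] for tag in candidates[video])

    return sorted(candidates, key=score, reverse=True)[:k]

# Invented example data.
watched = {
    "clip_a": {"politics", "debate"},
    "clip_b": {"politics", "interview"},
}
pool = {
    "clip_c": {"politics", "debate"},   # similar to history
    "clip_d": {"cooking", "travel"},    # counter-programming, never surfaces
    "clip_e": {"politics", "opinion"},
}

print(recommend(watched, pool))  # → ['clip_c', 'clip_e']
```

Even in this tiny model, the dissimilar video (`clip_d`) can never rank above similar ones, no matter how many recommendations are requested, mirroring how purely history-driven ranking crowds out counter-opinion content.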

This update, which went live in January 2017, had gone largely unreported, which explains the lack of public outcry. Until the public pushes back, YouTube is unlikely to change how it recommends videos, and users will remain stuck in a loop of content that YouTube thinks is best for them.


Originally reported by Reuters


Tags - Google, YouTube