Is YouTube’s recommendation system turning you into an extremist?

China is currently in the midst of a major epidemic, and the whole world is following the daily developments and their effects. My friend Joseph, an Australian-born Chinese, was also searching for information on YouTube. He typed ‘China coronavirus’ into the search box, looking for on-the-ground videos to understand the situation better. As one of the biggest news events in recent memory, the outbreak has drawn active coverage and constant updates from the major Western news agencies.

The first few search results came from official and well-known news organisations such as Channel 4 News, DW News and the South China Morning Post, sorted by relevance. Joseph clicked on the first video, which reported that 13 countries outside China had outbreaks of the coronavirus.

Figure 1

When that video ended, YouTube automatically played the next one, about two US citizens stuck in Wuhan because of the city lockdown; they could not leave in case they spread the virus. The lockdown, which restricts all access to Wuhan and cuts off every exit to other places, is a controversial decision that has triggered intense discussion in the Western world. Thanks to the recommendation system, Joseph’s attention shifted to videos discussing the lockdown. From there he was recommended videos about how the Chinese government manipulates social media at home and abroad, and he ended up watching one about the Chinese Communist Party. Only then did he realise how much time he had spent on these ‘fascinating-topic’ videos, and that he had forgotten his original aim: to learn more about the current coronavirus situation. (Figure 1)

Joseph’s experience suggests that YouTube’s recommendation system does more than offer similar content based on your preferences: it can also lead users towards radical and extreme content in order to capture more attention and keep them on the platform longer. A 2019 report by MIT Technology Review found that 70% of what users watch on YouTube is fed to them through the recommendation system, a measure of how powerfully the algorithm shapes the information people consume. On the one hand, people are happy to watch suggestions based on their watch history and predicted preferences. On the other, the system’s main goal is to keep you watching for as long as possible by serving content that is easy to get hooked on. As Guillaume Chaslot, who used to work at Google on YouTube’s recommendation system, noted in his talk at the 2019 DisinfoLab Conference, the algorithm is driven by watch time rather than by what viewers actually want.


Figure 2 (Gephi analysis)

This is where the problem arises: people unconsciously watch whatever the algorithm recommends, including provocative content designed to keep them on the platform longer, which in turn fuels the spread of misinformation. Since Google manages the algorithmic system behind the platform, it has announced that it is working on the issue and will ‘begin to reduce recommendations of borderline content and content that could misinform users in harmful ways’, as stated in a 2019 post on YouTube’s official blog. However, just like other big tech companies, Google never explains exactly how its algorithms work. (Figure 2)

Although Google’s response is a good sign, the problem remains serious. The recommended-videos sidebar starts from a basic structure: a shortlist of content based on the topic and other features of the video you are watching. The system then learns from your likes, clicks, search terms and other interactions on the platform, adding further content to the sidebar. Because the ranking of that list strongly shapes the viewing experience, it is fair to ask whether a recommendation appears because you like it or simply because the algorithm chose it. (Figure 3)

Figure 3
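To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python. It does not reflect YouTube’s actual code or data; the video attributes, user signals and weights are all invented. The point is only to show how a shortlist built from the current video can be re-ranked by signals learned from a user’s own history, so that each click shapes the next round of recommendations.

```python
from collections import Counter

def candidate_shortlist(current_video, catalogue):
    """Step 1: shortlist videos that share a topic with the one being watched."""
    return [v for v in catalogue
            if v["topic"] == current_video["topic"] and v["id"] != current_video["id"]]

def rank_sidebar(candidates, user_history):
    """Step 2: re-rank the shortlist using signals learned from past interactions."""
    topic_counts = Counter(v["topic"] for v in user_history)                 # watch history
    liked_channels = {v["channel"] for v in user_history if v.get("liked")}  # likes

    def score(video):
        # Invented weighting: topics you watch often, plus a boost for channels you liked.
        return topic_counts[video["topic"]] + (2 if video["channel"] in liked_channels else 0)

    return sorted(candidates, key=score, reverse=True)

# Every video the user clicks gets appended to user_history, so the next
# ranking is shaped by the previous recommendation: the loop feeds itself.
catalogue = [
    {"id": 1, "topic": "coronavirus", "channel": "news_a"},
    {"id": 2, "topic": "coronavirus", "channel": "news_b"},
    {"id": 3, "topic": "coronavirus", "channel": "commentary_c"},
    {"id": 4, "topic": "cooking",     "channel": "food_d"},
]
watching = catalogue[0]
history = [{"id": 5, "topic": "coronavirus", "channel": "commentary_c", "liked": True}]

print(rank_sidebar(candidate_shortlist(watching, catalogue), history))
```

In this toy run the commentary channel the user once liked outranks the neutral news channel, even though both match the current topic equally well.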

In fact, an article by Zeynep Tufekci published on the Scientific American website argues that YouTube’s business model is to keep users on the platform and show them as many targeted ads as possible. The algorithm therefore tends to promote whatever grabs attention, and borderline content with wild claims or radical viewpoints seems to be especially engaging. The machine-learning system records that engagement and recommends similar content to keep you watching, generating greater ad revenue. Under this dynamic, viewers often end up being pushed towards extreme content, much of it laced with fake news or conspiracy theories.
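The difference between optimising for watch time and optimising for what viewers actually value can be shown with a toy comparison. The candidate videos and the numbers below are entirely hypothetical, not drawn from any real ranking system; the point is only that an objective driven by predicted watch time can favour the most sensational option even when its predicted satisfaction is lowest.

```python
# Hypothetical candidates: (title, predicted watch minutes, predicted satisfaction 0-1).
candidates = [
    ("Official WHO coronavirus briefing",   6.0, 0.80),
    ("What the lockdown means for Wuhan",   9.0, 0.75),
    ("The SHOCKING truth they are hiding", 14.0, 0.35),
]

# A watch-time objective picks whatever it predicts will hold attention longest,
# while a satisfaction objective would pick what viewers say they value most.
by_watch_time   = max(candidates, key=lambda v: v[1])
by_satisfaction = max(candidates, key=lambda v: v[2])

print("Watch-time objective picks:  ", by_watch_time[0])
print("Satisfaction objective picks:", by_satisfaction[0])
```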

However, some argue that the blame should not fall on YouTube’s algorithm alone: users involved in the process bear some responsibility too. According to a 2019 academic article by Ariadna Matamoros-Fernández and Joanne Gray, ‘users are not passive participants in the algorithm system’; some video creators understand how the algorithm works and deliberately adjust their strategies to attract the most recommendations. As a public community, YouTube also offers an opening to extremist content creators who exploit the platform for propaganda. (Figure 4)

Figure 4

Although YouTube already restricts content with obviously harmful effects, such as hate speech or violence, Matamoros-Fernández and Gray argue that it is very hard for an algorithm to work out the hidden meaning of content that sits in a grey area. The system struggles to monitor and manage producers who operate in this shady territory, whose content may be read very differently by different cultural groups. For example, in June 2019 Vox reporter Carlos Maza had a very public dispute with the comedian Steven Crowder, arguing that Crowder repeatedly mocked his sexual orientation and ethnicity in YouTube videos with a sarcastic tone and abusive language. Crowder maintained that the videos were simply ‘friendly ribbing’. After Maza’s complaints on Twitter, YouTube publicly stated that Crowder had not broken its hate-speech rules, so the videos were allowed to remain on the platform. The decision, however, disappointed many marginalised communities.

So, is YouTube the only one that should take responsibility for the issue? Ben McOwen Wilson, YouTube’s managing director, told the BBC in a 2019 interview that YouTube is committed to tackling misinformation and conspiracies, but that doing so also requires joint effort from governments and from other major online platforms such as Facebook and Twitter. Since Joseph cannot avoid YouTube’s recommendations, he would do well to look at other platforms too, so that he is exposed to different perspectives and can ultimately form his own views. (Figure 5)

Figure 5