Alice, Alice… I’m falling down YouTube’s algorithm hole

                                   Photo taken by researcher

On a Wednesday afternoon, 20-year-old Psychology student Hannah was sitting in University College's Library when she typed the words "suicide prevention" into YouTube's search engine. Most of the suggested videos were music videos, with little informative content on suicide being recommended. Moreover, as Hannah scrolled down, to her surprise, the platform suggested that she watch a video of Budd Dwyer committing suicide live on camera, a video with over 500,000 views. Hannah was shocked. How could YouTube's algorithm recommend such a revolting, extreme video when that was not part of her search query? What, she thought to herself, would the implications be if a child had come across such a horrifying video?

Launched in 2005, YouTube is the second most used search engine after Google. The platform makes it easy for individuals to create, discover and share video content. Over 2 billion users access the platform every month, with Americans as the largest audience (https://www.businessofapps.com/data/youtube-statistics/). According to Pew Research, the platform provides content for children, serves as a pastime for 28% of users and gives 19% of viewers their updates on current affairs (https://www.pewresearch.org/internet/2018/11/07/many-turn-to-youtube-for-childrens-content-news-how-to-lessons/). It has therefore become increasingly clear that YouTube is replacing traditional sources of media such as television, with all age demographics actively using the platform on a daily basis.

So how does it work?

YouTube's recommendation algorithm uses machine learning to suggest videos. The system is built on deep-learning neural networks that draw on users' personal data. The algorithm tracks each user's 'watch time', the amount of time viewers have spent watching a video, and by analysing such metrics it suggests similar videos to maximise the time viewers spend on the platform. Personalising content, it recommends videos that suit and target users' individual tastes. The suggested videos then appear in the 'Up Next' list or play automatically. The platform tends to push popular videos with 'clickbait' titles, headlines designed to attract users and encourage them to click on a video link, which in turn makes the company more profit. However, there have been concerns about the platform's capacity to steer individuals towards extremist content.
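To make that feedback loop concrete, here is a minimal, purely illustrative Python sketch of ranking candidate videos by expected watch time. It is not YouTube's actual code; the class, the scoring formula and the numbers are invented for illustration.

```python
# Toy illustration only - not YouTube's actual code. It mimics the idea the
# article describes: score each candidate by expected watch time and surface
# the top results as "Up Next". All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_seconds: float  # hypothetical model output
    click_probability: float        # hypothetical model output

def rank_up_next(candidates, k=3):
    """Order candidates by expected watch time: P(click) * predicted duration."""
    scored = [(c.click_probability * c.predicted_watch_seconds, c) for c in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in scored[:k]]

if __name__ == "__main__":
    pool = [
        Candidate("Suicide prevention documentary", 420, 0.02),
        Candidate("Chart-topping music video", 210, 0.20),
        Candidate("Conspiracy theory compilation", 900, 0.08),
    ]
    for video in rank_up_next(pool):
        print(video.title)
```

In this toy pool, the long conspiracy compilation and the catchy music video outrank the prevention documentary, which is exactly the dynamic described above: whatever holds attention longest rises to the top.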

According to former YouTube employee Guillaume Chaslot, "AI isn't built to help you get what you want - it's built to get you addicted to YouTube. Recommendations were designed to waste your time" (https://thenextweb.com/google/2019/06/14/youtube-recommendations-toxic-algorithm-google-ai/). Chaslot has underlined how the platform pushes clusters of entertainment videos as well as mildly extremist content, from conspiracy theories to political propaganda to fake news, because slightly controversial topics result in more clicks, feeding our natural curiosity for the unknown.

YouTube's algorithm is designed to maintain user interest and increase watch time for as long as possible. Google, which purchased YouTube for $1.65 billion in 2006, has said that 400 hours of content are uploaded to YouTube every minute (https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html). By pushing content further towards extremes, the algorithm steers viewers towards videos that increase engagement on the platform, with users sharing their opinions in the comment section, re-playing videos and re-posting.

In effect, because of the platform's low barriers to entry, over 70% of the videos it hosts are controversial, even though the majority of users engage less with such content (https://www.techspot.com/news/73178-youtube-recommended-videos-algorithm-keeps-surfacing-controversial-content.html). Falling down YouTube's algorithmic rabbit hole can be easy, and children, teens and vulnerable individuals have become ever more exposed to extremist content that is not appropriate for their age and can be triggering.

Should we be concerned?

YouTube has created an extension of the platform specifically for children called 'YouTube Kids', which allows parents to add filters and select appropriate video content. However, even this extension of the platform contains dark corners. Videos such as "PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized" have appeared on the platform and have triggered nightmares for some children. Staci Burns, mother of 3-year-old Isaac, told The New York Times that "there are these horrible people out there that just get their kicks off of making stuff like this to torment children", creators who have found ways to trick the YouTube Kids algorithm.

Although YouTube has been opaque about how its recommendation algorithm works, it has been working towards filtering and moderating extremist content and creating a user-friendly platform for all.

A YouTube Kids spokesperson has said that the company aims to prevent situations like Staci's, in which inappropriate content is recommended to children. Moreover, the platform has been relatively transparent about removing over 8 million videos since 2017. The removed videos breached the community guidelines, as they included violent, hateful, unethical and graphic content (https://transparencyreport.google.com/youtube-policy/featured-policies/violent-extremism?hl=en_GB).

In early December 2019, two third-year students at King's College London conducted independent research, following their interview with Hannah, to understand what YouTube's recommendation algorithm would suggest for the sensitive topic of suicide. Using the YouTube Data Tools together with the network-visualisation software Gephi, the students mapped the different recommendation clusters returned for the keyword suicide. They noticed that the most dominant cluster of videos centred on entertainment, with films and music videos appearing to be more popular than suicide prevention documentaries. From this, the researchers inferred that YouTube tends to suggest entertainment videos to its viewers because these are more likely to increase watch time on the platform, which in turn makes the company more profit.
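The article does not spell out the students' exact pipeline, but the first step of a study like this might look something like the following Python sketch: query the YouTube Data API v3 for videos matching the keyword, then export a simple Source,Target CSV that Gephi can import as an edge list. The API key, file name and result limit are placeholders, not details from the research.

```python
# Hypothetical sketch of one way to gather data for such a study: fetch
# YouTube search results for a keyword and write a Gephi-importable edge
# list (keyword -> video title). Not the students' actual code.
import csv
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; requires a Google Cloud API key
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def search_videos(query, max_results=25):
    """Return (videoId, title) pairs for the top search results."""
    params = {
        "part": "snippet",
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "key": API_KEY,
    }
    response = requests.get(SEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    items = response.json().get("items", [])
    return [(it["id"]["videoId"], it["snippet"]["title"]) for it in items]

if __name__ == "__main__":
    results = search_videos("suicide")
    # Gephi reads a plain Source,Target CSV as an edge list.
    with open("edges.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["Source", "Target"])
        for video_id, title in results:
            writer.writerow(["suicide", title])
```

Loading such a CSV into Gephi and running its layout and community-detection tools is what surfaces clusters like the entertainment-heavy one the students describe.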

Overall, YouTube remains in the lead as one of the primary sources of video content. The platform has more than a billion users to date and has launched in 91 countries, making it one of the most visited websites worldwide. By 2025, the platform is expected to dominate online entertainment. We can therefore expect the recommendation algorithm to continue growing and, hopefully, to improve in order to create a safer environment for all.