Falling Down YouTube’s Rabbit Hole: How Does Its Recommendation Algorithm Work?

Have you ever started watching a video on YouTube, only to realise hours later that you have fallen down a crazy YouTube rabbit hole? We’ve all been there. But why is it that YouTube can drift us so far from the video we started with, and who decides which videos are recommended?

Image: Pixabay

That would be YouTube’s recommendation algorithm. The software is responsible for which videos appear in the “Up Next” column and on users’ homepages. Over 70% of all time spent on YouTube is spent watching videos recommended by the platform’s algorithm. With that much power over what we watch, we should find out how it works.

There is no public information about the exact specifics of the algorithm. However, according to Google, whose artificial-intelligence arm built YouTube’s current system, it works primarily through two steps:

1) Personalisation: the algorithm selects a pool of candidate videos for a particular user, based on what has been watched by other users who watch similar videos and share similar demographics.

2) Ranking: the selected videos are then ordered by how likely the user is to watch them, using signals such as how many videos they have already watched from that channel and their recent search queries. A simplified code sketch of both steps follows this list.
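To make those two steps concrete, here is a deliberately minimal sketch in Python. Everything in it is invented for illustration: the toy watch histories, the scoring weights and the function names are all my own, and YouTube’s real system is a large-scale neural network. The shape of the pipeline, though, is the same idea: gather candidates from similar users, then rank them by a predicted likelihood of being watched.

```python
# A hypothetical, heavily simplified two-stage recommender.
# All data, names and weights are invented for illustration; YouTube's real
# system is a large-scale neural network trained on billions of signals.

from collections import Counter

# Toy watch histories: user -> set of video IDs they have already watched.
watch_history = {
    "alice": {"v1", "v2", "v3"},
    "bob":   {"v2", "v3", "v4"},
    "carol": {"v3", "v4", "v5"},
}

def candidate_videos(user, histories, top_neighbours=2):
    """Step 1 (personalisation): collect videos watched by the users whose
    histories overlap most with this user's, excluding anything already seen."""
    seen = histories[user]
    neighbours = sorted(
        (u for u in histories if u != user),
        key=lambda u: len(histories[u] & seen),  # overlap with this user's history
        reverse=True,
    )[:top_neighbours]
    candidates = Counter()
    for u in neighbours:
        for video in histories[u] - seen:
            candidates[video] += 1
    return candidates

def rank(candidates, channel_affinity, query_matches):
    """Step 2 (ranking): order candidates by a crude 'likelihood of watching'
    score built from neighbour popularity, channel affinity and search relevance."""
    def score(video):
        return (
            candidates[video]                      # how many similar users watched it
            + 2 * channel_affinity.get(video, 0)   # videos already watched on that channel
            + 3 * query_matches.get(video, 0)      # overlap with recent search queries
        )
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    cands = candidate_videos("alice", watch_history)
    up_next = rank(cands, channel_affinity={"v4": 1}, query_matches={"v5": 1})
    print(up_next)  # the order of alice's hypothetical "Up Next" column
```

The real pipeline replaces these hand-tuned weights with learned models, but the division of labour described in the two points above, a cheap candidate step followed by a more careful ranking step, is the same.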

If the algorithm worked perfectly, every video YouTube recommended would be incredibly interesting to us. That was not the case for Nick Brown. Nick is a teenager who spends a lot of time on YouTube, primarily watching videos about UK politics. A strong Labour supporter, Nick spends most of his time watching videos about the party’s news. However, one Tuesday evening in December 2019, just two days before the general election, the teen was an hour deep into a YouTube binge, gathering as much information as he could about the upcoming vote, when he was led down a strange path. “On my Up Next was a Maroon 5 music video,” Nick stated. “I’ve never listened to Maroon 5 in my life.”

The music video is hugely popular on YouTube, with over 2.5 billion views. Despite the video being an obvious hit, Nick was not impressed. “I don’t get it,” he muttered angrily. “What does the Labour party have to do with Maroon 5?”

This is a question I have been asking myself since speaking to the teen. Why would YouTube recommend such an unrelated video? With 500 hours of video uploaded every minute, I understand that YouTube has a lot of footage to deal with. But with YouTube being one of the largest and most powerful online platforms, it seems fitting to investigate what is really going on with YouTube’s recommendation algorithm.

In order to uncover the truth behind how the recommendation algorithm works, I embarked on an investigation.

In keeping with the UK general election in December 2019, I decided to look into which YouTube videos would be recommended when I searched ‘Jeremy Corbyn, Anti-Semitism’.

Image: Pixabay

On December 12th 2019, the UK general election took place, resulting in a landslide Conservative majority. The reasons for such an extreme result have been the subject of speculation ever since, with Brexit and the NHS frequently cited. However, one factor that may have contributed to Jeremy Corbyn’s heavy defeat is the number of allegations of anti-Semitic behaviour that have surrounded the Labour party recently, which prompted nine of its members to resign in protest.

I used a digital tool that scraped every video YouTube’s algorithm would recommend from my search for ‘Jeremy Corbyn, Anti-Semitism’. The search returned 1,803 recommended videos, and I visualised the network of videos in the graph below.
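Before showing the graph, here is a rough idea of how such a network can be assembled. This is a hypothetical reconstruction rather than the tool’s actual code: it assumes the scraper writes one CSV row per recommendation, with invented column names “source” and “recommended”, and simply loads those pairs into a directed graph with networkx.

```python
# Hypothetical sketch: turning scraped recommendation pairs into a network.
# Assumes a CSV called "recommendations.csv" with columns "source" and
# "recommended" (both invented names), one row per recommendation edge.

import csv
import networkx as nx

def build_network(path="recommendations.csv"):
    """Load (video -> recommended video) pairs into a directed graph."""
    graph = nx.DiGraph()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            graph.add_edge(row["source"], row["recommended"])
    return graph

if __name__ == "__main__":
    G = build_network()
    print(G.number_of_nodes(), "videos,", G.number_of_edges(), "recommendation links")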


Image: by the author

The different colours symbolise clusters of videos that are similar to each other. As I zoomed into the blue cluster, I was equally shocked and amused by the video titles that appeared. Wildly extreme titles such as “I broke my legs to satisfy my mom but it was not enough” or “I Like Older Men So I Got Pregnant By A Grandpa” were present. Because of how absurd the titles were, I was intrigued enough to click on them, and I was not alone in that. The videos had millions of views, with thousands of outraged comments and dislikes.
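The clustering itself is typically produced by a community-detection algorithm. The sketch below, again entirely hypothetical, shows the idea using networkx’s greedy modularity method on a handful of invented video IDs: videos that recommend one another more than chance end up in the same group, and each group would get its own colour in a visualisation like the one above.

```python
# Hypothetical sketch of how coloured clusters emerge from community detection.
# The edges below are invented stand-ins for scraped (video -> recommended) pairs.

import networkx as nx
from networkx.algorithms import community

edges = [
    # an "election" cluster of invented video IDs...
    ("corbyn_speech", "labour_manifesto"),
    ("labour_manifesto", "election_debate"),
    ("election_debate", "corbyn_speech"),
    # ...a "shock title" cluster...
    ("shock_story_1", "shock_story_2"),
    ("shock_story_2", "shock_story_3"),
    ("shock_story_3", "shock_story_1"),
    # ...and a single bridge that pulls viewers from one to the other.
    ("election_debate", "shock_story_1"),
]

G = nx.Graph()
G.add_edges_from(edges)

# Greedy modularity maximisation groups videos that link to each other more
# often than chance; each group corresponds to one colour in the visualisation.
clusters = community.greedy_modularity_communities(G)
for i, cluster in enumerate(clusters):
    print(f"Cluster {i}: {sorted(cluster)}")
```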

Image: by the author

Despite these videos being completely irrelevant to Jeremy Corbyn or anti-Semitism, YouTube knows that shocking video titles are more likely to be clicked on.

This section of the graph suggests that the algorithm favours videos with extreme titles over videos that are actually relevant and personalised for the user. Despite the irrelevance of these recommendations, YouTube is clearly still doing something right, as 70% of all time spent on the platform is spent watching recommended videos. Guillaume Chaslot, founder of AlgoTransparency, has stated that the recommendation algorithm often does not reflect what the individual wants, focusing instead on what is likely to get clicked on.

The investigation taught me that the algorithm tends to assume that popular videos, the ones clicked on regularly, will satisfy everyone. A study by the Pew Research Center found that the 50 videos recommended most often by the algorithm had each been viewed over 456 million times.

My brief investigation suggests that the algorithm isn’t really as clever as one might assume, and mainly focuses on what will get the masses to click rather than what will please the individual.