WHERE ARE YOUTUBE’S RECOMMENDATIONS TAKING US?

According to the Jewish Labour Movement, Corbyn’s leadership of the Labour party since 2015 has turned it into a “welcoming refuge for anti-semitism”. The party’s responses to such accusations have been characterised by denial.

Corbyn has repeatedly been associated with antisemitic behaviour. The book Bad News for Labour: Antisemitism, the Party and Public Belief counts more than 5,000 stories about Corbyn, antisemitism and Labour published since 2015, and Jewish concerns have grown alongside them. A recent poll by The Jewish Chronicle found that 47% of British Jews would consider emigrating should he win. This would represent the largest Jewish exodus from a Western country since the 1930s.

Overall, Corbyn’s media coverage has been hostile, and the influence of social media platforms over newsrooms has associated him with antisemitism. The openness of these platforms has empowered the public to shape the content of information as much as the platforms shape our perception of things. Increasingly, websites are powered by algorithms that Tarleton Gillespie describes as influencing our choices, ideas and opportunities.

Algorithms are used to create recommendation systems: data is collected and filtered to promote certain pieces of content over others. Though these systems are promoted as optimisers of the user experience, AlgoTransparency founder Guillaume Chaslot believes the motivations behind them are flawed and often unrelated to what the viewer wants.

The YouTube App on a smartphone – Photo by: freestocks.org

YouTube maximises different relationships, from users to advertisers, and creates monetary value out of watch time. After working for several months on YouTube’s recommendation system, Chaslot concluded that “the problem is that the AI isn’t built to help you get what you want, it’s built to get you addicted to YouTube”. The platform’s interest comes first, and as YouTube becomes more central to people’s news consumption, such use of algorithms may threaten our ideas and perceptions.

This technology therefore calls for questioning the system’s reliability and its consequences for the public interest.

To understand YouTube’s recommendation system, we explored video networks built by repurposing data from YouTube’s ‘related videos’ feature, starting from the controversial query ‘Jeremy Corbyn’ and ‘anti-semitism’. We established a network of the videos and their associations through that feature. Of the roughly 1,800 different videos recommended, fewer than 15% were directly related to our query.
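The measurement step described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors’ actual pipeline: the edge list, video titles and matching rule are all hypothetical stand-ins for the harvested ‘related videos’ data, and a real study would match on more than titles.

```python
# Hypothetical sketch: given an edge list harvested from YouTube's
# 'related videos' feature (seed video -> recommended video) and the
# recommended videos' titles, estimate what share of recommendations
# actually mentions the query terms. All data below is illustrative.

QUERY_TERMS = {"corbyn", "antisemitism", "anti-semitism"}

# (seed_id, recommended_id) pairs, as one might export them for a Gephi graph
edges = [
    ("a1", "b1"), ("a1", "b2"), ("a2", "b2"),
    ("a2", "b3"), ("a3", "b4"),
]

# Titles of the recommended videos (normally collected alongside the edges)
titles = {
    "b1": "Jeremy Corbyn responds to antisemitism report",
    "b2": "Theresa May attacks Labour over Brexit",
    "b3": "Nigel Farage on the Brexit deadline",
    "b4": "Panel debates islamophobia in British politics",
}

def is_related(title: str, terms=QUERY_TERMS) -> bool:
    """A recommendation counts as 'related' if its title mentions a query term."""
    lowered = title.lower()
    return any(term in lowered for term in terms)

recommended = {dst for _, dst in edges}          # unique recommended videos
related = [v for v in recommended if is_related(titles[v])]
share = 100 * len(related) / len(recommended)

print(f"{len(recommended)} recommended videos, {share:.0f}% related to the query")
```

On this toy data only one of the four recommendations mentions the query, mirroring (in miniature) the under-15% figure reported above; the same edge list, saved as a CSV, is what a tool like Gephi would ingest to draw the network.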

Gephi Graph cluster focused on Brexit and conservative position – Photo by: Emma Neveux

With more than 75% of recommended videos unrelated to our query, one pattern in particular seemed to occupy the recommendation system: 24% of the recommended videos revolved around the Labour party and Corbyn, but more specifically around the heated Brexit issue. Stepping away from the relationship between Corbyn and antisemitism, YouTube promotes, in its own interest, a different political debate to capture people’s attention.

These recommendations displayed prominent conservative names such as Nigel Farage and Theresa May, always in the context of discrediting Corbyn’s political acts and party through his position on Brexit. The promoted videos mainly came from reliable sources such as The Guardian News or BBC Newsnight, which adds credibility to the videos and shapes users’ perceptions. The videos undermine Corbyn’s integrity and reliability through titles such as “Theresa May tells Corbyn to quit as Labour leader in final exchange” or through discrediting content.

Corbyn’s unclear stance on Brexit provokes criticism and misunderstanding. After being critical of the European Union, he went on to support membership in the 2016 referendum, and by 2019 he endorsed a referendum on any Brexit withdrawal agreement while personally remaining neutral. This grey area in the public’s eyes offered his rivals an opportunity to focus their energy on discrediting him through accusations of indecision. YouTube’s recommendation system embraces this by replacing our initial controversial issue with another one to secure viewers’ watch time. Chaslot told The Guardian that “YouTube is something that looks like reality, but it is distorted to make you spend more time online” and that it is not “optimising for what is balanced or healthy for democracy”.

We are only one click away from being distracted for hours by a recommendation system that puts forward unrelated topics tied to Corbyn’s unpopularity.

Almost 10% of the recommended videos brought up issues revolving around the Israeli-Palestinian conflict or islamophobia, which have no direct relationship with Corbyn and the antisemitism accusations against him. Ali declared in an article for Digital Information World that YouTube is “becoming a network for extremist content”. The mixed issues found in YouTube’s recommendations and the conflations they create make us, like Chaslot, worry that the recommendations will drive people further to extremes, because it is in YouTube’s interest to keep us watching for as long as possible.

Screenshots of YouTube comments – Photo by: Emma Neveux

The recommended videos all put forward controversial names like Trump and Boris Johnson in relation to race and religion, as well as controversial topics like Palestine, antisemitism and islamophobia. The most closely related video mentioning Corbyn came from Sky News: “We should be PROUD of Corbyn’s record on Palestine”, in which Michael Walker heavily emphasised Corbyn’s position towards Palestine and manipulated his support to discredit him. The comments that sprang up under such videos tended to be extreme and to project radical positions.

The recommendation system is flawed: by focusing on watch time, it endangers our perception of events. Borderline content is more engaging, and taken out of context it can lead to the radicalisation of ideas.

The opacity of these algorithms and the weight of recommendation systems call for public scrutiny, to offer people a clearer view of what is actually being recommended on YouTube. The platform ‘innocently’ facilitates our navigation while distracting us from the real issues and promoting radical positions.

Chancellor Merkel called the issue a “challenge not just for political parties but for society as a whole”, turning a warning light on the dangers of these systems for a free and democratic society.