Did YouTube's algorithms convey to voters that there was anti-Semitism in the Labour Party?

On 12 December 2019, the Conservatives won the general election with an 11.2-point lead over the Labour Party. There has been much speculation about what happened in the run-up to the election. One overarching theory is that Jeremy Corbyn did not represent the views of most Labour voters, and that majority opinion was drowned out by a vocal Corbynite minority advocating his vision of a socialist Britain.

One way in which Corbyn's leadership is said to have failed is in his handling of allegations of anti-Semitism within the party. The Conservatives faced accusations of Islamophobia over the same period, yet Corbyn appeared far slower to respond to, and express his disdain for, the behaviour than his Conservative counterparts. It was only when put under pressure that he released a public statement.

Many party members and representatives felt that their leader had not responded adequately to what is a serious issue. The sentiment grew so strong that in February 2019 nine MPs resigned from the party in protest. Nor was this feeling confined to the parliamentary party; it was shared by voters. Labour's former whip, Graham Jones, said that on the campaign trail he met voters who cited the anti-Semitic sentiment in the party as one of the reasons they would not be voting Labour in the 2019 general election. For the voters to whom this mattered, it would undoubtedly have made them feel isolated within their own party.

[Image: polling station. Photo by Elliott Stallion on Unsplash]

Social media carried particular weight in this election campaign. Gephi, an open-source network-visualisation tool, can help us delve further into this point: it lets us map which videos are recommended and the links between them. YouTube, chosen for its enormous popularity, offers a way to examine the sort of material a voter is exposed to through personalised recommendations, since its algorithm draws on the user's history to generate them. The aim of this investigation is to see how these videos are linked and the potential impact this has on voters. To narrow the search, I used the keywords 'Jeremy Corbyn' and 'Anti-Semitism'.
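For readers who want to assemble this kind of network themselves, the sketch below shows one plausible collection pipeline, not the exact one used here. It assumes a YouTube Data API v3 key (the API_KEY placeholder is hypothetical), and it relies on the API's relatedToVideoId search parameter, which was available when this analysis was carried out but has since been retired by YouTube. The GEXF output opens directly in Gephi.

```python
import requests
import networkx as nx

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder: a YouTube Data API v3 key
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def search_videos(query, max_results=10):
    """Seed the network with the top results for a keyword search."""
    params = {"part": "snippet", "q": query, "type": "video",
              "maxResults": max_results, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json().get("items", [])
    return {i["id"]["videoId"]: i["snippet"]["title"] for i in items}

def related_videos(video_id, max_results=10):
    """Fetch videos YouTube lists as related to a given video.
    NB: relatedToVideoId worked at the time of this analysis;
    YouTube has since retired the parameter."""
    params = {"part": "snippet", "relatedToVideoId": video_id,
              "type": "video", "maxResults": max_results, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json().get("items", [])
    return {i["id"]["videoId"]: i["snippet"]["title"] for i in items}

# Build a directed graph: an edge A -> B means B was recommended from A.
graph = nx.DiGraph()
seeds = search_videos("Jeremy Corbyn Anti-Semitism")
for vid, title in seeds.items():
    graph.add_node(vid, title=title)
    for rvid, rtitle in related_videos(vid).items():
        graph.add_node(rvid, title=rtitle)
        graph.add_edge(vid, rvid)

nx.write_gexf(graph, "corbyn_network.gexf")  # GEXF opens directly in Gephi
```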


What does the graph show?  

Within the network, I focused on the cluster shown below, which contained the highest volume of recommended videos related to the theme of anti-Semitism. Alongside Jeremy Corbyn's name, the names Theresa May and Nigel Farage appeared several times, which can be attributed to their being his chief political opponents. The themes that surfaced were anti-Semitism, UKIP, and the Israeli-Palestinian conflict (a rough way to tally such themes is sketched after the figure).

[Image: Gephi graph of the recommendation network, by the author]
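One rough, purely illustrative way of quantifying which themes dominate the cluster is to count keyword occurrences in the video titles stored on the graph's nodes. The file name and 'title' attribute below are assumptions carried over from the collection sketch above.

```python
from collections import Counter

import networkx as nx

# Load the network exported by the collection sketch above
# (file name and 'title' attribute are assumptions from that sketch).
graph = nx.read_gexf("corbyn_network.gexf")

# Candidate themes observed in the cluster; purely illustrative.
THEMES = ["anti-semitism", "antisemitism", "ukip", "palestine", "israel"]

counts = Counter()
for _, data in graph.nodes(data=True):
    title = data.get("title", "").lower()
    for theme in THEMES:
        if theme in title:
            counts[theme] += 1

print(counts.most_common())
```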

From this evidence we can infer that, although it does not make a conclusive case that the videos impacted voters, the recommendations were centred around the topics above. Further, YouTube's recommendations go beyond the title and also draw on the content of the videos. For example, a BBC Newsnight interview discussing the Labour Party's stance on the Israeli-Palestinian conflict was one of the main nodes in the network, yet the words 'Israel' and 'Palestine' did not appear in its title. From that node, the algorithm nevertheless recommended a video with 'Palestine' in the title. This suggests that the algorithm considers the content as well as the title.
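This observation can be checked programmatically. The sketch below, again assuming the GEXF export and 'title' attribute from the earlier sketch, lists recommendation edges where the source title never mentions 'Israel' or 'Palestine' but the recommended video's title does, i.e. links that title overlap alone cannot explain.

```python
import networkx as nx

graph = nx.read_gexf("corbyn_network.gexf")
KEYWORDS = ("israel", "palestine")

def mentions_keyword(node):
    """True if the node's stored title contains any of the keywords."""
    title = graph.nodes[node].get("title", "").lower()
    return any(k in title for k in KEYWORDS)

# Recommendation edges that title overlap alone cannot explain:
# the source title never mentions the keywords, the target title does.
for src, dst in graph.edges():
    if not mentions_keyword(src) and mentions_keyword(dst):
        print(graph.nodes[src].get("title"), "->", graph.nodes[dst].get("title"))
```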

The central node in this cluster is an episode of The Nigel Farage Show in which Farage discusses his stance on Jeremy Corbyn. When the tabloids began running the anti-Semitism story, it gave Farage ample opportunity to attack his opponent. Several videos within the cluster feature opponents publicly speaking out against the candidate. From a voter's perspective, this amplifies the story: if there was any doubt about the candidate's stance on the issue, the recommendation algorithm deepens it by presenting strong sentiment from rival leaders. Thus the negative perception fostered by the initial video, which was likely directly about the issue, is further developed by recommendations of videos in which Corbyn's opponents berate him for it.
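One way to identify such a central node quantitatively is in-degree centrality, which scores each video by how often it is recommended from other videos in the network. A minimal sketch, assuming the directed GEXF export from before:

```python
import networkx as nx

# The GEXF file written earlier records the graph as directed,
# so read_gexf returns a DiGraph and in-degree is well defined.
graph = nx.read_gexf("corbyn_network.gexf")

# In-degree centrality: the share of other videos that recommend this one.
centrality = nx.in_degree_centrality(graph)

# The top-ranked node should correspond to the cluster's hub -- in this
# network, the Nigel Farage Show episode discussed above.
for node in sorted(centrality, key=centrality.get, reverse=True)[:5]:
    print(f"{centrality[node]:.3f}  {graph.nodes[node].get('title')}")
```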


What does the network show?  

This network demonstrates the power that social media algorithms can have over voters. Digital campaigning has become an integral part of political campaigning globally, and it was heavily utilised during this general election. When analysing the recommendation algorithm, one must consider to what extent these platforms work against the politicians. In the cluster examined, the topics being recommended evidently served to hurt Corbyn's public perception: any voter with doubts about the magnitude of the anti-Semitism allegations would subsequently have been pushed further videos discussing it, likely with opposition politicians giving their opinions.

Neither the existence nor the extent of anti-Semitism within the party is the issue here; the point of this investigation is to analyse the extent to which YouTube's algorithm can influence voters' decisions. Platforms like YouTube are undoubtedly a major content provider for voters in the modern age. From the information gathered, the recommendations do appear to be based on shared themes in either the title or the content; thus, when a video expressing an opinion is viewed or searched for, the resulting recommendations may serve to reinforce that opinion.

However, the effect undoubtedly works in the other direction too: positive videos about Jeremy Corbyn would be recommended to someone who searched for them. Ultimately, in tense political climates negative sentiment is repeated and amplified, and in this case YouTube's algorithms facilitated the promotion of content that pushed the discussion of anti-Semitism within the Labour Party onto voters.