“YouTube is a platform where we can express ourselves and collaborate with each other,” says Ethan (20) about the social networking site. Student that he is, he is sitting in a coffee shop with his laptop open in front of him. While telling me about his experience of YouTube, he quickly scrolls through his subscriptions feed, where the videos range from gaming to makeup tutorials. “I don’t really use YouTube to upload videos myself, but I think it’s just a really good place to explore whatever you want to,” he comments on his account. When asked about the algorithms behind the demonetisation of videos, however, the economics student is out of his element. “I have no idea how it works because it doesn’t really affect me, but it seems like it’s a lucrative business to be in right now.”
While some ‘YouTubers’ do indeed make a lucrative living, a question arises: is this something anyone can achieve?
According to the creators behind the Rainbow Coalition, the answer is no. In a video titled “WE’RE SUING GOOGLE/YOUTUBE – And here’s why…”, posted 14 August 2019, eight LGBT+ YouTube creators announce that they are suing the companies over biased algorithms demonetising queer content. Their concerns involve not only their livelihoods being threatened, but also their content being categorised in ways that in some cases restrict it and make it difficult for the creators to reach their audience. This is a serious claim to make, as it accuses YouTube of enabling a potentially homophobic algorithm. Given that the platform is one of the biggest social media networks on the market, with over a billion users, the accusation is one the company should be taking immediate action to address, especially in today’s political climate. Yet these claims are far from new.
@TylerOakley, a highly influential LGBT+ YouTuber, tweeted in March 2017 that “one of my recent videos “8 Black LGBTQ+ Trailblazers Who Inspired Me” is blocked …”, meaning that it would not reach audiences browsing in Restricted Mode. This raises the question of why. As explained in the YouTube Help Centre, Restricted Mode is a setting that, when switched on, aims to hide content concerned with drugs and alcohol; sexual situations (referring to “overly detailed conversations or depictions of sex or sexual activity”); violence; mature subjects (“relating to terrorism, war, crime and political conflicts that resulted in death or serious injury…”); profane and mature language; and incendiary and demeaning content. Which of these guidelines Oakley’s video could have broken, however, remains unclear.
YouTube, for its part, has been struggling to keep up with these issues. In an interview with YouTube creator Alfie Deyes, the CEO of YouTube, Susan Wojcicki, states that there are no policies in place to demonetise queer content based on queer terminology in the title. This claim has since been tested by a group of YouTube users through reverse engineering. In short, reverse engineering means replicating a process, in this case the algorithm in question; done successfully, it can serve as a research method for understanding how that process behaves. When these YouTubers, including data researcher Sealow, conducted their research, they found that 33% of the queer titles tested were automatically demonetised. Even more striking, when the LGBT+ terminology was swapped for the words ‘happy’ or ‘friend’, all of the demonetised content was instantly monetised. This contradicts both Wojcicki’s statement and YouTube’s guidelines on monetisation.
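To make the substitution experiment concrete, the logic can be sketched in a few lines of Python. This is purely illustrative: YouTube's real demonetisation model is not public, and the blocklist and function names below are hypothetical. The sketch only mimics the behaviour the researchers reported, namely a classifier that flags titles containing certain terms, which then pass once those terms are swapped for a neutral word.

```python
# Illustrative sketch only: YouTube's actual demonetisation system is not
# public. This toy "classifier" reproduces the reported behaviour with a
# hypothetical keyword blocklist.

BLOCKLIST = {"gay", "lesbian", "transgender"}  # assumed flagged terms


def is_demonetised(title: str) -> bool:
    """Flag a title if any blocklisted word appears in it."""
    words = {w.strip(".,!?").lower() for w in title.split()}
    return bool(words & BLOCKLIST)


def substitution_test(title: str, replacement: str = "happy"):
    """Replicate the swap experiment: classify the original title, then the
    same title with blocklisted words replaced by a neutral word."""
    swapped = " ".join(
        replacement if w.strip(".,!?").lower() in BLOCKLIST else w
        for w in title.split()
    )
    return is_demonetised(title), is_demonetised(swapped)


before, after = substitution_test("My Gay Wedding Vlog")
# before is True (flagged); after is False (monetised once "Gay" is swapped)
```

The point of the sketch is that a purely keyword-based filter would produce exactly the pattern the researchers observed: identical videos treated differently solely because of the vocabulary in the title.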
Madeleine, a 19-year-old student in London, helped shed some light on public knowledge and opinion on the subject. When informed about the issue, she put her history book back on the table and seemed perplexed. “I had no idea that this was going on,” she admitted. Shown the research on the subject, along with a basic introduction to algorithms, she could not hide her surprise. “I’ve actually always thought of algorithms as just numbers and maths, and therefore neutral.” She is not alone in thinking this. As our technology advances, it becomes harder for the average person to keep up with the processes going on beneath the surface. This can make us forget a very important fact: algorithms are programmed by humans.
In an ideal world, algorithms would indeed be neutral, and so represent equality. Until then, though, algorithms are programmed by humans, which means that humans may, intentionally or unintentionally, perpetuate biases found outside the computer science domain. This has been seen time and again, perhaps most clearly in the case of facial recognition software. The way facial recognition still fails to identify people of colour as reliably as it identifies white people goes to show that there are indeed biases embedded in algorithms, as is also evident in the case of queer YouTube creators and their content.

As the conflict between YouTube and its queer creators continues, it is important to remember that no perfect algorithm yet exists. This is not to say that the LGBT+ community should take the issue lightly, but there might be room for more communication from both sides. While working towards a future with unbiased algorithms, it may prove more effective to be transparent about algorithmic issues, so that users can learn about the processes that go into them and understand the difficulties of engineering a neutral algorithm. Ethan and Madeleine bear this out, as neither of them had encountered the problem before. As Ethan notes, “I’ve literally never come across this before. As a gay person myself I would want to know more about how these problems are tackled before I blindly support a homophobic algorithm. But like, I’m not that surprised either, because it only reflects the inequality in the real world, doesn’t it?”