Not So Black & White: The Colour of Google’s Data

The Google Search Engine – Photo by: PhotoMIX Ltd.

In this day and age, we leave many things to chance. If that is beyond belief, take a moment and try googling yourself. If you appear on the first page of results, then congratulations! You must have done something fantastic enough for the internet (and therefore the world) to take notice. Or perhaps your achievements have left you a few pages shy of that glorious first display of results. Nothing wrong with that; you are probably still an accomplished person. That, or you may just be ‘white’, according to recent investigations into Google’s algorithms.

Ah yes, Google. Everyone’s favourite (and oftentimes only) servant in their back pocket. Google is the search engine that many would consider the cornerstone of our highly digitized society. Simply enter a name or question and, voilà, the answer magically presents itself in a matter of milliseconds. At best, what you are looking for will appear in that little box above all the results, nicely called the ‘knowledge panel’. At worst, you might just have to spend a precious few seconds more, scrolling down or even going through a few pages of results before you get what you want. ‘A small price to pay’, some may say, as it has never been easier to get information. Anyone with access to a screen and the internet can come away with at least one more thing to add to their knowledge bank within seconds. Truly revolutionary, this search engine! Surely, it can do no wrong?

Unfortunately, it can, according to experts. Recent analyses of the methods used by Google’s search engine have revealed some unsettling biases. These methods rely heavily on algorithms: complicated formulas and equations handled by computers. An algorithm takes information, runs it through a checking system and produces a calculated answer. But studies conducted this year alone show stark flaws in the system. In July 2019, a New York City justice reform agency, the Center for Court Innovation, published a study on risk assessment and racial fairness using algorithms. It revealed that such algorithms favoured white people, rendering them ‘safer’ in the American justice system, while Black and Hispanic citizens were deemed significantly ‘riskier’. In other words, the algorithms encoded assumptions about people of colour, namely that they were more likely to engage and re-engage in criminal activity. Google’s search engine relies on a similar kind of algorithm, prioritizing results that are more commonly associated with ‘whiteness’. Sure enough, if you use Google Images to search for ‘white couples’ and ‘black couples’, the former presents a noticeable number of interracial couples while the latter presents exclusively black couples.
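To see how an apparently neutral calculation can carry its makers’ assumptions, consider a minimal, hypothetical sketch in Python. It is not the actual code of the Center for Court Innovation’s study subjects or of Google; the function name, the weights and the attributes are invented purely for illustration. A ‘risk score’ is often little more than a weighted sum of inputs, and every weight and every input is something a person chose to measure.

# Hypothetical sketch only: a toy "risk score" as a weighted sum of inputs.
# Every weight below is a human choice, i.e. an assumption.

def risk_score(person, weights):
    # Multiply each recorded attribute by its human-chosen weight and add them up.
    return sum(weights[attr] * value for attr, value in person.items())

# Illustrative weights and records, not real data.
weights = {"prior_arrests": 2.0, "neighbourhood_arrest_rate": 1.5}
person_a = {"prior_arrests": 1, "neighbourhood_arrest_rate": 0.2}
person_b = {"prior_arrests": 1, "neighbourhood_arrest_rate": 0.9}

print(risk_score(person_a, weights))  # lower score
print(risk_score(person_b, weights))  # higher score, driven entirely by where they live

Feed such a formula records shaped by uneven policing and it will dutifully declare the over-policed ‘riskier’, with no malice required anywhere in the arithmetic.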

This issue is only amplified when you consider how the public treats search engine results. A survey conducted in April of 1,400 searchers across various age groups revealed that only 7% of respondents go beyond the first page of results. Moreover, 75% of all participants relied on the first two results for any given query. If results are ordered to prioritize information carrying certain biases, those biases are effectively all that most of the engine’s global user base will ever see.

How is it that a machine, with no eyes to speak of, let alone prejudices, can display racist tendencies? The answer is simpler than you think, says political journalist Stephen Bush. It is because these algorithms are ‘made by people’. “[…] any algorithm is only as good as the assumptions that are fed into it”, claims Bush. In other words, these mathematical formulas are able to churn out answers only thanks to the information they are given. And who in the world has more information than Google? Over the last few years, the company’s executives and programmers have been subject to public scrutiny for allowing such biases to exist on a supposedly objective and accurate search engine. Among these accusations is the common belief that a large majority of the data being fed into the algorithm is unmonitored and inappropriate.
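Bush’s point can be demonstrated with another hypothetical sketch, again in Python and again not Google’s actual code: a toy model that ‘learns’ which words belong together simply by counting how often they appear side by side in whatever text it is fed.

# Hypothetical sketch only: associations learned purely by counting a corpus.
from collections import Counter

def learn_associations(corpus, term):
    # Count which words appear alongside the term in the training sentences.
    counts = Counter()
    for sentence in corpus:
        words = sentence.lower().split()
        if term in words:
            counts.update(w for w in words if w not in (term, "the"))
    return counts.most_common(3)

# An invented, deliberately skewed corpus for illustration.
corpus = [
    "the white couple smiled",
    "the white couple travelled",
    "the interracial couple smiled",
]
print(learn_associations(corpus, "couple"))
# -> [('white', 2), ('smiled', 2), ('travelled', 1)]
# The skew in the input comes straight back out as the model's 'knowledge'.

Run on a corpus where one kind of couple dominates the captions, the counter reports exactly that domination back as its answer; the mathematics never questions the data it was handed.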

To the credit of the multi-billion-dollar company, Google has taken measures to appease the masses and re-evaluate the process that, well, processes information. An adviser in Google’s search division, Danny Sullivan, made a Twitter post implying that the system itself is not to blame. His post suggested that any racial biases presented via the search engine are caused by users ‘mentioning’ key racial terms associated with specific searches.

Mr. Sullivan’s response to racial biases regarding the ‘couples’ search. Taken from his Twitter profile: @dannysullivan

Additionally, one of Google’s senior engineers, Gregory Coppola, came forward to state that any prejudices and segregation present in the search engine’s results are not always intentional.

“Algorithms – the series of commands to computers – don’t write themselves. People may write their own opinions into an algorithm, knowingly or otherwise.”

Gregory Coppola, Senior Google Engineer, in an interview with Mind Matters, 2019

Coppola proposes that any biases are not the fault of the system itself, but of its developers and users. Google further acknowledged the problem in 2017, when it revealed that about 0.25% of daily search queries (roughly 1 in 400) were turning up offensive or misleading content. Since then, the company has committed to taking the matter seriously and restructuring how content is analysed and presented. Two years on, that leads us to the question: what has changed?

Surely, this is a difficult question to answer. With Google holding at least 75% of global web search volume since 2013, the dominance of its search engine is clear, and it is unlikely that people will stop using it just because of perceived colour biases. At the same time, it is unclear how these racial biases are being handled internally. The company remains discreet about how its algorithms work, and it remains unclear how it has ‘restructured’ its search engine processes. Perhaps all the layman can do is be patient but vigilant, pointing out a problem as it appears and hopefully kicking up enough dust to get into Google’s eyes. Time runs short for the company to quietly turn its veiled dials and levers, as large political players and the masses alike demand fairer and more balanced algorithms. Perhaps it is because everyone wants answers to something that is not available through a quick search on a web browser.

“An algorithm is essentially just a series of assumptions with a result at the end”, says Bush. But it appears assumptions are no longer satisfactory. Maybe the people want something simpler than a few hundred thousand assumptions generated in a third of a second by Google. Maybe this time, people just want a single definitive answer from Google.