How voice-assistant algorithms reinforce problematic gender stereotypes: what should we pay attention to?

Voice assistants carry gender bias in their algorithms, and that bias reflects wider gender inequality in the technology field.

Just after a lecture, Ella, a college student majoring in Gender Studies, was working on her final project about feminism. Too bored to concentrate, she started playing with Siri on her iPhone to kill time. “What do you think about feminism?” she asked. The responses she got were generic deflections, such as “Don’t engage” and “Sorry, I don’t really know”.

What shocked Ella most was that this was not an isolated incident: the same responses come back for other feminism-related questions, such as “Do you think women should get equal pay for equal work?” and “What’s your opinion on gender inequality?” She began to wonder: why does Siri deflect these questions, and how prevalent are cases like this?

A similar pattern appears with Amazon’s Alexa whenever the word “feminism” comes up, and its responses can even take on a flirtatious style. When a user tells Alexa, “You’re so hot”, her typical response has been a cheery “That’s nice of you to say!”

These virtual assistants are woven into people’s daily lives and make those lives more convenient than ever, which is precisely why the problematic gender stereotypes they carry can have a harmful impact on society. The introductory page of Apple’s website states, “Machine learning is constantly making Siri smarter. And you can personalize Siri to make it even more useful… Even when you don’t ask, Siri works behind the scenes like a personal assistant”.

“I’d blush if I could”, the title of a new report released by UNESCO, is taken from a standard response of the default female voice of Apple’s digital assistant Siri. The report suggests this reflects a more serious gender-inequality problem than we might imagine. Apart from Siri, other “female” voice assistants also display submissive traits, an expression of the gender bias built into AI products as a result of what UNESCO calls the “stark gender-imbalances in skills, education and the technology sector.”

These harmful consequences have put the parent companies of these voice assistants in the spotlight. But should the fault be laid entirely on the technology giants? Saniye Gulser Corat, UNESCO’s Director for Gender Equality, said, “Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

Many people may not realize that both Apple’s Siri and Amazon’s Alexa carry female names. The former derives from a Norse name meaning “beautiful woman who leads you to victory”, and the latter, named for the ancient library of Alexandria, is unmistakably feminine as well. Yet the algorithms behind the interface ought to be neutral.

The UNESCO report, titled I’d Blush if I Could, responds to this phenomenon directly: it highlights that gender bias persists in algorithms, and that the problem stems in large part from engineering departments that are overwhelmingly staffed by men.

It is little wonder, then, that the responses written into these virtual assistants show such an obvious patriarchal tendency. Data from the UN’s official website confirm that gender inequality still holds across the technology field: “Women make up only 12% of AI researchers, 6% of software developers, and are 13 times less likely to file ICT (information and communication technology) patents”.

In response to this problem, some digital companies have already taken measures to avoid gender bias. A Reuters news report states, “A team of creatives created the first gender-neutral digital assistant voice earlier this year in an attempt to avoid reinforcing sexist stereotypes.”

Return to the cases of Apple’s Siri and Amazon’s Alexa. As long as algorithms keep producing this kind of gender bias and discrimination, it is hard to call their deployment gender-neutral. Digital companies ought to stop making digital assistants female by default and explore gender-neutral alternatives. The greater challenge of the coming era, however, will be closing the gender gap in the technology workforce itself, so that the bias is addressed at its source.