Published: Fri, May 24, 2019
IT | By Lester Massey

Female A.I. Voices Encourage Harmful Gender Stereotypes

AI assistants with female voices are fueling gender bias and reinforcing the patriarchy with submissive and coquettish responses to men, the United Nations has said. "In many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment", the authors wrote.

As well as recommending that the digital gender gap be closed by "recruiting, retaining and promoting women in the technology sector", the report recommends ending the "practice of making digital assistants female by default", including by shipping more voice assistants with male-sounding voices as the default. (Think Spike Jonze's Her, which remains the most accurate depiction of the near future in film you can find today.) How we interact with the increasingly sophisticated intelligences powering these platforms could have profound cultural and sociological effects on how we treat other human beings, service workers, and the humanoid robots taking on more substantial roles in daily life and the labor force.

UNESCO's report argues that because these voices are all female and sound somewhat subservient, they reinforce the idea that women are assistants who can be ordered around and sexually harassed. "It honours commands and responds to queries regardless of their tone or hostility", the report says.

"While Google's voice assistant is simply Google Assistant and sometimes referred to as Google Home, its voice is unmistakably female", said the report. In the future, it's likely voice assistants will be the primary mode of interaction with hardware and software with the rise of so-called ambient computing, when all manner of internet-connected gadgets exist all around us at all times.

The report analyzes inherent gender bias in voice assistants for two purposes: to demonstrate how unequal workplaces can produce sexist products, and to show how sexist products can perpetuate harmful, misogynistic behaviors. Current voice assistants give users some element of control, allowing them to change voices, accents, or genders, but they all default to a female voice.
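To see how small that "default" decision is at the code level, here is a minimal sketch using Google Cloud's public Text-to-Speech API. The DEFAULT_GENDER constant and synthesize_reply function are hypothetical illustrations, not the implementation of any real assistant or anything drawn from the UNESCO report:

    # Sketch: how a voice-gender default gets hard-coded into a product,
    # using Google Cloud Text-to-Speech (pip install google-cloud-texttospeech).
    # DEFAULT_GENDER and synthesize_reply are hypothetical names chosen for
    # illustration, not taken from any shipping assistant's source code.
    from google.cloud import texttospeech

    # The product-level decision the report criticizes: unless a user opts
    # out, every reply is synthesized with a female-sounding voice.
    DEFAULT_GENDER = texttospeech.SsmlVoiceGender.FEMALE

    def synthesize_reply(text: str, gender=None) -> bytes:
        """Render assistant text to speech, falling back to the default gender."""
        client = texttospeech.TextToSpeechClient()
        voice = texttospeech.VoiceSelectionParams(
            language_code="en-US",
            # A user-chosen gender overrides the default; most users never set one.
            ssml_gender=gender or DEFAULT_GENDER,
        )
        response = client.synthesize_speech(
            input=texttospeech.SynthesisInput(text=text),
            voice=voice,
            audio_config=texttospeech.AudioConfig(
                audio_encoding=texttospeech.AudioEncoding.MP3
            ),
        )
        return response.audio_content

Switching the default to a neutral voice, or prompting users to pick one at setup, would be a one-line change, which underscores the report's point that the female default is a product choice rather than a technical constraint.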

Microsoft's Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman, while Apple's Siri means "beautiful woman who leads you to victory" in Norse. The report also documents how assistants deflect verbal abuse: where Siri answered a sexualized insult with "I'd blush if I could", the phrase that gives the report its title, Alexa responded to the same insult with "Well, thanks for the feedback".

According to the report, "women make up just 12 percent of AI researchers".

"Because the voices of assistants are often female, it sends a signal that women are compliant and docile", writes Unesco.
