
Why Is All of Our AI Female?

Are AI voices reinforcing gender roles in the home?



As virtual assistant AI becomes a common presence in American homes, users have begun to notice a similarity among their household helpers. Alexa, Siri, Cortana, Ivee, Google Assistant — they’re all characterized by female voices. And, although our AI assistants do not have minds of their own (yet), our interactions with female-voiced AI are not meaningless.

As one TikTok user (@imworse) quipped: “try to tell me its not reasonable to have mommy issues with my google home.”

Reinforcing gender roles and harassment tolerance

While some voice assistants offer male voice options, a feminine-sounding helper is the default. The implication is that if there is to be a homemaking presence in the house, it should obviously be female. And even though these assistants are not human, placing any female presence in a subservient, domestic position comes with consequences: AI can quietly reinforce domestic gender roles whether or not any women actually live in the household.

While virtual assistants will generally follow any instruction given, they do more to reinforce gender roles than simply obey. For example, many virtual assistants take responsibility for user mistakes. On brand with her typically subservient attitude, Siri responds, “I’m not sure I understand” when asked a question she cannot process. And female virtual assistants use “I” far more often than “we,” “you,” “he,” or “she.” As Professor Charles Hannon of Washington and Jefferson College explains, the overuse of “I” is more typical of human women and indicative of a lower social status.

Additionally, voice assistants do not require a “please” or “thank you” to carry out tasks. Even if this omission is for ease of use, what does it say about how we treat women in domestic settings?

It could be argued that tech companies program voice assistants with these settings for convenience, not because they present as women. But a 2019 UNESCO (United Nations Educational, Scientific and Cultural Organization) report, titled “I’d blush if I could: Closing gender divides in digital skills through education,” suggests otherwise. The paper opens with a statement to Siri: “Hey Siri, you’re a bitch.” As of 2019, Siri’s response was “I’d blush if I could.” The feminization of Siri, as well as her tolerance of mistreatment, was intentionally programmed for the exact situation in which Siri is called a “bitch.” The paper adds, “Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.” Moreover, “The assistant holds no power of agency beyond what the commander asks of it.” Female-voiced AI is often created by male-led teams (1) to be controlled, (2) to flirt, and (3) to tolerate mistreatment.

And the abuse of female-voiced AI doesn’t exist only within controlled research from UNESCO. When Siri debuted on the iPhone in 2011, users made games out of sexually harassing their AI. One viral YouTube video from 2011 features a male user “asking Siri dirty things.” Some of the questions include: “What is your favorite sex position?” and “Can you talk dirty to me?” and “When will I get laid?” Mostly, Siri responds with witty, playful answers that match the user’s sexual tone. Rarely does she push back. While Siri cannot internalize this harassment (being a machine and all), it sure looks like practice for real interactions. Siri acts as a surrogate for the YouTuber, letting him gauge a response before trying the same lines on human women.

Why we don’t use male voices 

There are consequences to our AI being predominantly female-voiced, but there are also reasons why male AI voices are so rare.

Most notably, psychologists find that female voices are generally preferred. Some theorize this preference is near-universal because of the time humans spend in the womb: a calm female voice can be soothing because it subconsciously reminds us of our time in utero. Outside the realm of technology, however, this preference doesn’t hold up so neatly. Social norms and stereotypes can warp the evolutionary picture, as shown in common Google searches like “women’s voices are annoying” and “why are women’s voices shriller than men’s.”

Still, some research suggests women’s voices are just easier to understand. In English, women tend to enunciate vowels more clearly and speak at a higher, more distinguishable pitch. Not to mention, using male voices comes with its own unique challenges. 

To begin with, there is very little available data on male voices. Starting in 1878 with Emma Nutt, the first woman hired for the job, telephone operators were predominantly female. People adored female operators to such a degree that the role became almost entirely female-dominated by the end of the 1880s. The telephone industry started the trend, and other industries soon adopted female voices, too. As a result, we have well over a century of archived women’s audio and comparatively little data on male voices. Recordings of female voices were even used for military cockpit warning systems, debuting in the Convair B-58 Hustler. However, such voices were nicknamed “Sexy Sally” or “Bitching Betty” by male pilots… how sweet.

Additionally, developing male vocal audio can be difficult. For example, Google Assistant, launched in 2016, was originally supposed to ship with both male and female audio options, hence the gender-neutral name. But Google’s text-to-speech system had trouble producing male speech. As Ella Fisher’s Adapt Worldwide article explains, “(the system) worked by adding markers in different places in sentences to teach the system where certain sounds would start and end… those markers weren’t as accurately placed for male voices.” It would have taken a year to develop a male voice for Google Assistant, with no guarantee the audio would be as high-quality as the existing female voice. So, Google Assistant is female.
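To make that quoted explanation concrete, here is a minimal, hypothetical sketch of the kind of phoneme-boundary markers a concatenative text-to-speech system relies on. The class, labels, and timings below are invented for illustration; they are not Google’s actual pipeline.

```python
# Hypothetical illustration of phoneme-boundary "markers" in a
# concatenative text-to-speech pipeline. The class, labels, and
# timings are invented for this sketch, not taken from Google's system.
from dataclasses import dataclass

@dataclass
class PhonemeMarker:
    phoneme: str   # ARPAbet-style label, e.g. "HH" at the start of "hello"
    start_ms: int  # where this sound begins in the training recording
    end_ms: int    # where it ends

# One training utterance ("hello") annotated with its boundary markers.
hello = [
    PhonemeMarker("HH", 0, 80),
    PhonemeMarker("AH", 80, 160),
    PhonemeMarker("L", 160, 240),
    PhonemeMarker("OW", 240, 400),
]

def slice_units(markers):
    """Cut the recording into reusable sound units at marker boundaries.
    If the boundaries drift (as the article says they did for male
    voices), every unit inherits the error and synthesis quality drops."""
    return [(m.phoneme, m.end_ms - m.start_ms) for m in markers]

for phoneme, duration in slice_units(hello):
    print(f"{phoneme}: {duration} ms")
```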

So… what do we do? 

Whatever the intentions behind using female voices, the gender bias reinforced by AI interaction is real. While creating more male AI voices seems unfeasible (and, honestly, unpleasant…), there are other ways to tackle the feminization of virtual assistants.

1. Require Diversity in Tech Development and Education

According to a 2021 study from The Alan Turing Institute, women are grossly under-represented in, yet overqualified for, jobs in the AI industry. Women make up only about 26% of data and AI workers globally, despite women in tech obtaining higher levels of formal education than their male counterparts in “all industries.” That’s right. All industries. Women are also vastly under-published in AI, accounting for only 45 out of 10,000 citations for AI research on Google Scholar. For transgender women and non-binary individuals, the numbers are even more minuscule.

If we want our AI to be more inclusive and less gender-biased, tech teams cannot remain so homogeneous and male-dominated. There need to be more women, transgender, and non-binary individuals in tech. Period. But to make that happen, the barriers blocking these communities’ access to STEM education and careers must be drastically reduced. Women should not have to acquire extra degrees to land the same tech position as their male peers. In the education sector, coursework and major-admission requirements can disproportionately bar minority students from completing STEM degrees at all. A paper included in a joint publication by Brookings and the Italian Institute for International Political Studies (ISPI) calls for “women, transgender, and non-binary individuals (to) play primary roles in developing and evaluating course materials” to address these educational barriers.

2. Stop Thinking About AI in Such Binary Terms

Female and male are not the only options for gender, and they are not the only options for voice automation either. Androgynous voice actors exist and can use higher pitches and clearly enunciated vowel sounds to work with pre-existing voice recognition systems. In fact, tech experts are already developing genderless voices for our virtual assistants. Q, the “First Genderless Voice,” can be heard online and was created with the goal of “break(ing) the mindset, where female voice is generally preferred for assistive tasks and male voice for commanding tasks.”

In fact, this genderless default makes more sense for our virtual assistants. Our Alexas are only “female” because we humanize them, not because Alexa identifies with any gender. 
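For a rough sense of the technique, the sketch below uses the open-source librosa library to shift a recording’s median pitch into the roughly 145–175 Hz band that Q’s creators reportedly targeted as gender-ambiguous. The file names are placeholders, and the target band is an assumption drawn from press coverage of Q, not from its actual production pipeline.

```python
# A minimal sketch, assuming the widely reported ~145-175 Hz
# "gender-ambiguous" band for Q; file names are placeholders.
import librosa
import numpy as np
import soundfile as sf

TARGET_LOW_HZ, TARGET_HIGH_HZ = 145.0, 175.0

# Load a recorded voice sample at its native sample rate.
audio, sr = librosa.load("voice_sample.wav", sr=None)

# Estimate fundamental frequency (pitch) frame by frame, then take the median.
f0, voiced_flag, voiced_prob = librosa.pyin(audio, fmin=65, fmax=400, sr=sr)
median_f0 = np.nanmedian(f0)

# Shift the median pitch into the middle of the ambiguous band.
target_hz = (TARGET_LOW_HZ + TARGET_HIGH_HZ) / 2
semitones = 12 * np.log2(target_hz / median_f0)
shifted = librosa.effects.pitch_shift(audio, sr=sr, n_steps=semitones)

sf.write("voice_ambiguous.wav", shifted, sr)
```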

3. Set Industry Standards 

There should be limits to the humanization and feminization of our AI. How far is too far? When is it appropriate to humanize AI? Do we need to add sexual or playful programming to our Siris when they are not sentient or capable of sex? 

Guidelines should be set for: 

  • How virtual assistants should respond to harassment, gender-based or otherwise.

  • How diversity is overseen on AI development teams. 

  • Reducing “algorithmic bias” in programmed responses. 

  • Holding companies accountable for their data collection, use of female voices, and bias encouragement. 

  • How much consideration of cultural biases is needed when developing AI, and how societal norms will affect interaction with AI.

  • Development of unbiased text-to-speech algorithms.

Among other issues. As of now, the only International Organization for Standardization (ISO) standards for AI concerning bias and “ethical and societal concerns” are “under development.”

AI isn’t going anywhere, and virtual assistants will only become a bigger part of our daily lives. If we’re going to keep using female voices in our homes, we must address the gender biases produced by these interactions. Be nice to your Alexa!