Does Alexa Reflect or Reject Patriarchal Values?

Emily Daly / April 8 / Artificial Intelligence

Illustration by Tamara Siewart @t__earz

From Alexa waiting on us in the living room to public service announcements guiding us through the train station, have you noticed that this tranquil, feminised AI voice has infiltrated almost every corner of our lives?

While many of us first heard a feminised AI voice in Spike Jonze’s award-winning film Her, in which a lonely man falls in love with a disembodied female machine, today this voice has seeped into many facets of our lives: voice assistants, sat-navs, supermarket checkouts and public service announcements. It is widely recognised, however, that feminised AI voices are mainly employed in assistive roles, which raises questions around gender: do female AI voices reinforce gender roles and affect our attitudes towards real women?

In July 2019, UNESCO published ‘I’d Blush If I Could’, a title borrowed from digital assistant Siri’s response to being called a sexually provocative term. The report argued that female AI voices perpetuate gender biases and called on technology firms, whose workforces are predominantly male (only 12% of AI professionals globally were female in 2019), to stop defaulting their voice assistants to a female voice. The study also notes how the obliging, ‘eager to please’ role of feminised digital assistants means they ‘greet verbal abuse with a catch-me-if-you-can flirtation’, such as ‘I’d blush if I could’; these machines are unable to stand up for themselves, which encourages the idea that women should accept insults and poor treatment.

Whether it’s Google Home or Amazon’s Alexa, feminised digital assistants have become an essential feature in living rooms across the globe, with approximately 100 million smart speakers sold worldwide in 2018. Enabling consumers to carry out a range of household tasks and reminding them of important things to do, these assistants play the ultimate helper in a sphere of domesticity. They are at our service at the touch of a button, always bright and available, which could send the signal that women are always there to serve and help. In contrast, machines such as security robots often have a male AI voice, as they are used to represent authority, which further perpetuates gender stereotypes.

Yet to what extent does the subservient role of a female AI voice actually impact attitudes towards women in society? Although it is hard to ascertain a definitive answer, one case study asserted that ‘41% of people said that speaking to Alexa is like speaking to a real person’, and according to research by Gartner, some people will have more conversations with their voice assistant than with their spouses, which could mean that this behaviour theoretically translates into how we treat women. The issue could be particularly prevalent among children: a Childwise report warned that kids who ask their smart speakers for information could become more aggressive towards people, and specifically women, in the future. While there is no hard evidence, this does raise the concern that these AI voices could in fact influence our treatment of women.

Another place we often encounter a female automated voice is at the supermarket checkout, where it guides the customer through the paying process. In 2015, Tesco changed the automated voice at its checkouts to a supposedly friendlier male voice, following complaints that the female voice was ‘irritating’ and ‘bossy’. This demonstrates how gendered AI voices can expose sexist attitudes and the assumption that women are too ‘bossy’ or ‘irritating’, despite the voice not belonging to a real woman. By contrast, users generally choose their personal digital voice assistants to be female, and these are not met with the same hostility - could this be because the supermarket checkout voices are slightly more commanding and instructive?

In 2019, a group of linguists, sound designers and researchers developed the world’s first gender-neutral AI voice, Q. The team drew on female, male, transgender and non-binary voices in order to contribute to the global conversation about gender, as well as to be more inclusive of all groups and identities in society. As Wired magazine put it, ‘this could be the genderless voice we need right now’; instead of a binary choice between a male- or female-sounding voice, it is important for AI to represent the complexities of gender. At the same time, it could be argued that a genderless AI voice will help end gender bias in relation to voice assistants.

Yet how can you create a completely gender-neutral voice, suspended between what we conceive to be masculine and feminine? According to sound designer Nis Norgard, this was achieved by finding a ‘zero’ voice that sits between the two. Gender perception is largely determined by pitch: there is a ‘sweet spot’ between 145 and 175 hertz where a voice can be heard as either masculine or feminine, depending on the listener. After creating four variants, Norgard sent research surveys to 4,500 people across Europe and, from their responses, arrived at the gender-neutral Q.
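To make that pitch idea concrete, here is a minimal illustrative sketch in Python, assuming (as the paragraph above suggests) that voices below roughly 145 Hz tend to be heard as masculine, those above roughly 175 Hz as feminine, and those in between as ambiguous. This is not the Q team’s actual method; the simple average-pitch heuristic and the function name are hypothetical, and real gender perception also depends on timbre, intonation and the listener.

# Illustrative sketch only: classify a voice's average fundamental
# frequency (F0) against the 145-175 Hz 'sweet spot' described above.
# Thresholds come from the article; this is not how Q was built.

def perceived_gender_range(avg_f0_hz: float) -> str:
    """Map an average F0 in hertz to a rough perceptual category."""
    if avg_f0_hz < 145.0:
        return "typically heard as masculine"
    elif avg_f0_hz <= 175.0:
        return "ambiguous: the 145-175 Hz 'sweet spot' Q aims for"
    else:
        return "typically heard as feminine"

if __name__ == "__main__":
    for f0 in (120.0, 153.0, 210.0):
        print(f"{f0:.0f} Hz -> {perceived_gender_range(f0)}")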

In the face of real-life patriarchal issues, the implications surrounding feminised AI voices can seem trivial. Yet could we use gendered AI voices as a mirror to recognise the gendered and sexist attitudes apparent in our society, and in fact learn from them? Could Q, therefore, be a marker of how we are moving towards a more inclusive and diverse society with regards to gender? As gendered AI voices take on an increasingly pervasive role in our lives, we must consider the impact they can have on further entrenching gender roles and, more importantly, on how we treat women in reality.

