The degendering of voice assistants
In 2016, U.S. consulting firm Gartner predicted that by 2020 the average citizen would have more conversations with bots than with their spouse or partner. They were not far off. Virtual assistants are increasingly present in, and integrated into, our daily lives. While talking to robots felt strange only a few years ago, it is now commonplace to ask Siri for the most popular pop song, to get Alexa's help making a cheesecake, or to ask Cortana for the fastest way to get from point A to point B. Virtual voice assistants are increasingly humanlike, but they tend to share one trait: their gender.
In less than ten years, the frequency of voice-based Internet queries has increased 35-fold, and such queries currently account for close to one-fifth of mobile phone searches, a figure expected to reach 50 percent by the end of the year. Virtual assistants today handle more than one billion tasks a month, ranging from the mundane, such as checking the time, to critical activities such as contacting emergency services.
As this functionality has been integrated into technology, there have been attempts to give the machine's voice a human quality, creating a personality that makes the technology more approachable. A common pattern, at least initially, was that these virtual voices were overwhelmingly female. Today almost all systems offer male voice options, but female voices still dominate the landscape by default. Robots are trained by humans to answer an endless number of questions; consequently, they absorb their creators' gender biases.
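That default is a product decision, not a technical necessity. As a loose illustration (the names VoiceConfig and AVAILABLE_VOICES below are hypothetical, not any vendor's real API), a single default value is enough for most users to inherit a female-voiced assistant, since few ever open the settings screen:

```python
# A minimal sketch of how a gendered default ships in a voice product.
# All names are invented for illustration; the point is that users who
# never change the settings inherit whatever the product team chose.

from dataclasses import dataclass

AVAILABLE_VOICES = {
    "female_1": {"gender": "female"},
    "male_1": {"gender": "male"},
    "neutral_1": {"gender": "neutral"},
}

@dataclass
class VoiceConfig:
    # The bias lives in this single line: a female voice is the
    # out-of-the-box experience unless the user actively changes it.
    voice_id: str = "female_1"

config = VoiceConfig()  # a fresh install, untouched settings
print(AVAILABLE_VOICES[config.voice_id]["gender"])  # -> "female"
```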
A UNESCO report attempts to explain this trend. The title of the research, I'd blush if I could, is how Siri would have responded to being called a vulgarly misogynistic insult (one starting with a 'b'). The study opens with this sexist anecdote and analyzes the role of education in helping to remedy gender bias in technology. The United Nations entity maintains that virtual assistants' feminine nature and the subservience they express are a clear example of how technology contributes to perpetuating these biases.
The study also notes that the trend of assigning virtual assistants a feminine gender occurs in a context of gender imbalance at technology companies, where men account for between 60 and 75 percent of the sector's total workforce. Bridging this divide and including more women in machine learning processes is fundamental to curbing the propagation of these unwelcome cultural stereotypes.
Blue, BBVA’s gender-neutral assistant
One way to prevent these problems could be gender-neutral virtual assistants, which demonstrate that intelligent technologies do not need an assigned gender. Some companies have chosen to 'personify' their virtual assistants as animals, retaining humanlike traits while sidestepping binary questions of gender.
And while it may seem necessary to attribute a gender to the things around us, gender is nothing more than a social construct. In reality, we constantly interact with elements in our environment without classifying them as male or female or attaching societally defined gender expectations to them.
This reasoning underpins the creation of Blue, the voice assistant developed entirely by BBVA. Its personality was defined with users from four countries (Argentina, Colombia, Spain, and Mexico), followed by a comprehensive analysis phase that used psychological and positioning methods to settle on the assistant's personality and tone: consistent, ethical, responsible, and gender-neutral. Blue's personality was built on the premise of a non-human assistant, seeking a balance between inclusive language, the need for clarity, and space limitations. This approach helps free Blue from unjustified social stereotypes.
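In gendered languages such as Spanish, that balance is a concrete writing problem: a direct greeting forces a grammatical gender, while the neutral rewording is often longer. The sketch below is purely illustrative and not BBVA's actual implementation; the strings and the greeting helper are invented to show the trade-off between inclusive phrasing and limited screen space:

```python
# Illustrative only: in Spanish, "Bienvenido"/"Bienvenida" forces a
# grammatical gender, while the neutral rewording is longer, which
# matters when on-screen space is tight.

GENDERED = "Bienvenido"             # masculine form, the phrasing to avoid
NEUTRAL = "Te damos la bienvenida"  # gender-neutral rewording, but longer

def greeting(max_chars: int) -> str:
    """Prefer the neutral phrasing; fall back only when it cannot fit."""
    if len(NEUTRAL) <= max_chars:
        return NEUTRAL
    return "Hola"                   # short and still gender-neutral

print(greeting(40))  # -> "Te damos la bienvenida"
print(greeting(10))  # -> "Hola"
```

Here the neutral phrasing wins whenever it fits, and the fallback is a shorter greeting that remains gender-neutral rather than a reversion to a gendered form.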
The human mind is culturally conditioned to try to determine whether a voice belongs to a man or a woman, and from there to imagine who might be hidden behind it. That line blurs when it comes to digital voices. Perhaps Blue, as an example of a gender-neutral voice assistant, will play tricks on the human mind, causing some confusion as we try to identify a particular gender, but its potential to create greater opportunity and help end gender bias is indisputable.