Female Voice Assistants Are Sexist, Argues New UN Report

Obsequious, coy, and above all female, virtual assistants like Alexa, Siri, and the Google Assistant are a widespread example of sexism, a new UN report argues, blasting the gender bias coded into tech products. The use of default female voices for digital assistants, and their programmed reactions to insults and inappropriate requests, could be contributing to a growing gender gap, the report suggests.


Female voices are generally the interface of choice for digital assistants like Amazon's Alexa, the Google Assistant on phones and smart speakers, and Apple's Siri on the iPhone and other devices. In some cases, manufacturers offer alternative voices, which can include different genders and accents. However, it's unclear how many people actually set their assistant to a different voice, or even know that the option exists.

As a result, for most users the assistant on their platform of choice is associated with a female identity. That's despite the underlying AI powering such services having no gender. Rather than being an interface afterthought, a United Nations study says, this pervasive default could be reinforcing or even creating "a digital skills gender gap."


For example, assistants' reactions to insults and inappropriate comments or requests are criticized. When Siri says "I'd blush if I could" in response to being called a derogatory slur, the UN report suggests, it promotes an idea of "submissiveness in the face of gender abuse."

That could be helping to discourage women and girls from getting into technology fields, the authors of the report suggest. "Siri's 'female' obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," they argue.

Apple and other tech companies have adjusted how their assistants react to suggestions and questions beyond the typical use-cases envisaged for the systems. Rather than blushing, for example, Siri now gives a deliberately underwhelming "I don't know how to respond to that" reply to sexist comments. Other changes in recent years have included encouraging users to say "please" and "thank you" as they interact with AIs.

All the same, the UN report counters that simply by presenting as female, such technologies potentially undermine women in the industry. While manufacturers have pointed to research indicating that users prefer – and sometimes find more useful – interacting with a female voice rather than a male one, the UN argues that gender bias plays a significant role there, too.


It's not just virtual assistants, however, that could be contributing to the problem. The UN also highlights the widespread use of female chatbots and virtual agents, which are likewise built on technology with no inherent gender but adopt a female identity in their interactions with human users. Voice-led assistant experiences are the big growth area, though, the authors note, and the one with the greatest potential to be problematic.

The report is likely to be controversial, with some undoubtedly countering that it anthropomorphizes technology to push a gender argument. Encouraging more people to consider ICT and engineering roles, however, generally seems like a good thing. Making digital assistants that are less servile and not necessarily identifiably female by default would be a solid place to start, the 145-page report suggests.

