A United Nations report says Siri and other female smart assistants reinforce gender bias

A United Nations report indirectly accuses smart assistant providers like Apple, Google and Microsoft of reinforcing gender bias by using female assistant voices by default.

Apple’s Siri, Microsoft’s Cortana, Google’s Assistant on Home speakers and Amazon’s Alexa are by far the most popular digital assistants out there. In the vast majority of markets, these assistants default to a female voice. Some assistants, like Alexa, use a female voice exclusively, while others, like Siri, let the user change the voice gender in Settings.

In some cases, an assistant’s default voice gender depends on the user’s specific market, and Apple is a good example of this: Siri uses a female voice in most countries, but defaults to a male voice when the system language is set to Arabic, French, Dutch or British English.

From the report, titled “I’d blush if I could”:

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘Hey’ or ‘OK’.

The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

The title of the report (“I’d blush if I could”) was once one of Siri’s responses to being called a slut (another: “Well, I never!”), as noted by 9to5Mac. Apple has since changed those responses to “I don’t know how to respond to that.”

It is also concerning that a female AI helper risks giving children the wrong idea about the role of women in modern society, potentially suggesting that it’s normal for women, girls and female-gendered individuals to respond on demand.

According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically.

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants—and penalized for not being assistant-like. This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.

I’m not sure what to think of this report beyond this: Apple, Google, Microsoft and Amazon are clearly aware of the cultural subtext here, otherwise Siri’s default voice gender wouldn’t vary by region. I’m less convinced the companies appreciate that all-female assistant voices could, and probably do, reinforce gender bias, especially among kids who might over time come to see a woman’s voice as proof of subservience.

Do female assistant voices really reinforce Western gender stereotypes? What’s your take on this report? Be sure to chime in with your thoughts in the comments below.