Female Voice Assistants Reinforce Harmful Gender Stereotypes Says UN

A new report calls on tech companies to take responsibility.

Voice-activated assistant technology like Siri and Alexa is most commonly programmed to have a female voice. But now the UN has called for big tech companies to change that.

In a report called "I'd Blush If I Could: Closing Gender Divides in Digital Skills Through Education," the United Nations asks companies like Google, Amazon, Apple, and Microsoft to stop making digital assistants female by default.



The report says doing so reinforces harmful stereotypes of women as submissive and provides an opportunity for verbal abuse without consequence.


The report recommends the voices be genderless - something Google has attempted by labeling its voices orange and red rather than male or female.


Women need better access to tech skills

The report also notes that women are 25 percent less likely than men to have basic digital skills and requests that powerful tech companies address this divide. You don't need to go far into the internet to find examples of people abusing voice-activated assistants like Siri.


While the abuse itself is problematic, so are the responses the systems have been designed to give.


At one point, calling Siri a slut would have prompted the response "I'd blush if I could," though that response is thought to have since been changed.


The table from the report below shows the responses given by four of the major voice assistants when various offensive names are directed at them.

Source: UNESCO

Now is the time for change

The results speak for themselves: major changes are needed. With these devices becoming ever more ingrained in our lives, now is exactly the time to make demands of the companies responsible for them.

Amazon is even reportedly creating a device that will be able to understand your mood. Using voice recognition, the wearable wrist device can learn how your voice changes under different emotional circumstances and even offer advice on how to handle social situations.

Artificial intelligence algorithms have already shown racial and gender bias, and we must remember that these systems aren't born neutral. Those biases come from the way they are developed and the data sets they are fed.

Clear pathways into STEM

As many have pointed out, the best way to deal with this bias is to have racially and gender-diverse teams working on these projects. To achieve this, people from diverse backgrounds and experiences need clear access and pathways into STEM careers.
