A recent UNESCO study, entitled “I’d Blush If I Could,” took its title from the response that Apple’s female-gendered voice assistant, Siri, was originally programmed to give when users called her a sexist name. Apple updated Siri’s programming only at the beginning of 2019, changing the response to “I don’t know how to respond to that.”
Siri was released back in 2011, yet it seems not to have bothered Apple much that it had assigned her such a coy, stereotypically feminine response; it stayed in the program for close to eight years. As the UNESCO report points out, “Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products.”
Such biases have plagued not only the technology sector but the entertainment industry as well. Of course, AI does not create bias, but it does reflect the prejudices of its programming. Now, though, AI is being used to counter bias.
Women on screens
Concern about the message about women’s roles conveyed by how women appear, or often fail to appear, in films and television is what motivates the nonprofit work of the Geena Davis Institute on Gender in Media. Its tagline is: “If she can see it, she can be it.”
There is a great deal of truth to that assertion about seeing the possibilities. As the study Women in Tech: Their Current Status, What They Have Achieved and What They Want shows, the lack of visibility of women in tech, particularly in leadership roles and as speakers at conferences, is very discouraging for women, and the majority of those surveyed reported that they would be more likely to attend a conference if women were featured.
Introducing an AI-powered spellcheck for bias
Geena Davis, the actress for whom the organization is named, is a devoted activist for women’s representation on screen. Appropriately enough, she announced that Disney is adopting an AI tool for detecting bias in films, one that bears her name: the Geena Davis Inclusion Quotient (GD-IQ), or GD-IQ: Spellcheck for Bias.
As reported in The Hollywood Reporter, Davis delivered the keynote speech this month at the Power of Inclusion Summit in New Zealand, during which she announced that her organization is teaming up with Walt Disney Studios to apply the AI solution that shares her name to identify possible biases in film and television scripts. The article explains:
The new tool leverages patented machine learning technology developed at the University of Southern California Viterbi School of Engineering to rapidly analyze the text of a script to determine its number of male and female characters and whether they are representative of the real population at large. The technology also can discern the numbers of characters who are people of color, LGBTQI, possess disabilities or belong to other groups typically underrepresented and failed by Hollywood storytelling.
In addition to counting up the representation of genders and races or ethnic identities of characters in a script, the tool can break down the quality of the representation in terms of the dialogue, including total number of words and the “level of sophistication of the vocabulary they use.” It also assesses “the relative social status or positions of power assigned to the characters by group.”
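The kind of script analysis described above can be pictured with a toy sketch. This is not the actual GD-IQ system (whose patented methods are not detailed in the article); it is a minimal, assumed illustration that counts lines and words of dialogue per character in a screenplay-style excerpt, then aggregates word counts by a hypothetical gender mapping supplied as metadata:

```python
import re
from collections import defaultdict

# Toy screenplay excerpt, using the common convention of a character
# cue (name in caps on its own line) followed by that character's dialogue.
SCRIPT = """\
JYN
I rebel.

GALEN
Whatever I do, I do it to protect you.

JYN
We have hope. Rebellions are built on hope.
"""

# Hypothetical metadata; a real system would infer or look this up.
GENDER = {"JYN": "female", "GALEN": "male"}

def dialogue_stats(script):
    """Count lines of dialogue and total words spoken per character."""
    stats = defaultdict(lambda: {"lines": 0, "words": 0})
    speaker = None
    for line in script.splitlines():
        line = line.strip()
        if not line:
            continue
        if re.fullmatch(r"[A-Z][A-Z ]+", line):  # cue line: speaker name
            speaker = line
        elif speaker:
            stats[speaker]["lines"] += 1
            stats[speaker]["words"] += len(line.split())
    return dict(stats)

def words_by_gender(stats, gender_map):
    """Aggregate total spoken words by the (assumed) gender of each speaker."""
    totals = defaultdict(int)
    for name, char_stats in stats.items():
        totals[gender_map.get(name, "unknown")] += char_stats["words"]
    return dict(totals)

if __name__ == "__main__":
    stats = dialogue_stats(SCRIPT)
    print(stats)
    print(words_by_gender(stats, GENDER))
```

Even this crude tally surfaces the kind of imbalance the tool is meant to flag; the real system goes much further, scoring vocabulary sophistication and the relative social status of characters by group.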
Davis said that her organization will be working with Disney for a “year using this tool to help their decision-making, identify opportunities to increase diversity and inclusion in the manuscripts that they receive.”
It’s interesting that Disney is doing this now, as it has come under fire in the past for depicting negative sexist and racist stereotypes in its animated films.
In recent years, Disney has changed its storylines and scripts to get more in tune with modern sensibilities. That’s how we went from the first princess, Snow White, who is not only domesticity incarnate but the “fairest of them all,” to the Polynesian princess Moana, who leaves her home not to flee for her life but to save her people as their ruler.
Many other princesses in between allowed Disney to showcase Native American, Asian, and African-American heroines, as well as princesses who didn’t simply end up married and living happily ever after, as in the case of Brave. There is also a whole slew of Disney remakes of its animated films with live actors, which gives the famed studio a chance not just to update the look of a film but to bring it into a more modern age with respect to the representation of race and sex.
But there have still been quite a few slip-ups along the way, even in films intended to be far more progressive than their predecessors.
The galaxy may be far, far away, but the film was not a long time ago
One striking example of surprisingly persistent sexism in the Disney universe is the 2016 Star Wars film Rogue One. Sure, there’s a female hero at the center of the film rather than a young man trying to find himself and come into his destiny by tapping into the power of the Force. There were even some women fliers among the rebels.
But did you notice something? Every single one of the scientists and engineers appearing in the movie was a man.
The protagonist’s father was the chief scientist leading a team of male engineers. Even if Disney was not yet ready to let our heroine’s mother be the scientist (though I don’t know why not), the producers could at least have included some women in the engineering group. But they didn’t, and that obvious oversight is something that should make the producers at Disney blush.
So if the new AI tool catches what is obvious to anyone aware that roles do not have to be gendered (you don’t get a pass for assuming all engineers and scientists are men just because you created a strong female lead), that would be a good thing for film.