AI, Technology, and Sexism


Photo from Tech Op

Why is Siri female?

Apple’s personal assistant on its devices has a voice that is distinctly female. We order Amazon Echo’s Alexa to perform the tasks we want. Microsoft’s Cortana, Samsung’s S Voice, and countless other voice-command systems in GPS navigation, apps, and websites are all programmed as female. There’s a reason for this: studies repeatedly show that people overwhelmingly prefer female voices to male voices when it comes to digital assistants.

Both cultural bias and our dependence on technology shape the way we so often program AI to be female. Traditionally, “helper” roles such as telephone operators, personal assistants, and secretaries have been held by women or portrayed as female in popular media. Clifford Nass, a researcher from Stanford, writes in his book that people expect help from female voices and characters, while a male voice is considered more authoritative. It’s no wonder, then, that we’re conditioned to expect digital assistants to have female voices, and that companies make the strategic choice to have programmers and scriptwriters characterize them as such.

The movies Her and Ex Machina also illustrate this pattern. In the former, Scarlett Johansson voices the personal AI companion, Samantha, and the protagonist Theodore falls in love with her before ultimately learning that she is simultaneously carrying on relationships with thousands of other users. In Ex Machina, a self-aware, intelligent, strong AI robot called Ava, created by a tech CEO named Nathan, invokes passionate feelings of both love and lust, influencing and even manipulating Nathan’s employee Caleb into helping her escape, a plot that ends with Nathan dead and Caleb left behind.

The characterizations of these AI women in sci-fi films speak to the fetishization and eroticization of female bots by the media and by our larger society, raising concerns about the construction of a gendered, highly sexualized, and racialized robot body. This sets a precarious precedent for how we view women and for how stereotypes are maintained and perpetuated. When women are sexualized through the robot or AI body, female characters risk being dehumanized altogether.

It’s worth considering that gender roles and stereotypes are embedded into technology through decisions made by programmers, executives at tech companies, entertainment studios, and others.

Consider one of the most everyday places machine learning touches us: an internet search. Researchers from the Universidade Federal de Minas Gerais in Brazil ran image searches for “beautiful” and “ugly” women across 59 different countries, then categorized the resulting photos by race and age. They discovered that the “beautiful” results tended to show young, white women, while the “ugly” keyword returned images of older, Black, or Asian women. Search results are shaped by algorithms that decide what internet users see; because those algorithms are designed and trained by people, they perpetuate racial and gender stereotypes and reinforce biases already present in society.

Another example comes from word embeddings that Google researchers trained in 2013 on a corpus of Google News articles, producing vectors for a vocabulary of roughly 3 million words and phrases. When researchers later probed the relationships between words in this vector space, analogy queries such as “father : doctor :: mother : x” returned x = nurse, and “man : computer programmer :: woman : x” returned x = homemaker. Blatant sexism, even in text written largely by professional journalists, could be surfaced with a simple algorithm.
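These analogy queries can be reproduced with off-the-shelf tools. The snippet below is a minimal sketch, assuming the publicly released GoogleNews-vectors-negative300.bin embedding file and the gensim library (neither is named above), and assuming the phrase token computer_programmer exists in the vocabulary.

```python
# Minimal sketch: solving word analogies by vector arithmetic over the
# publicly released GoogleNews word2vec embeddings (an assumption, not
# the exact setup of the studies described above).
from gensim.models import KeyedVectors

# Load the ~3-million-word vocabulary of 300-dimensional vectors (large file).
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# "man : computer programmer :: woman : x"
# Solve for x as: computer_programmer - man + woman, then take the nearest words.
print(vectors.most_similar(
    positive=["woman", "computer_programmer"],
    negative=["man"],
    topn=3,
))

# "father : doctor :: mother : x"
print(vectors.most_similar(
    positive=["mother", "doctor"],
    negative=["father"],
    topn=3,
))
```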

But if the AI and technology we see and use in our everyday lives are shaped by algorithms and human decisions, there is tangible progress we can make to combat stereotypes and biases. For example, we can apply de-biasing algorithms to the same search engines or vector spaces so that results counteract, rather than amplify, existing stereotypes and offer more equal representation. We can make sure that search results for certain keywords don’t return skewed outcomes. There is still a lot of work to be done to analyze the societal implications of programming and characterizing AI as female as development in AI and machine learning accelerates, but there is also much room for positive change.
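As a concrete illustration of the kind of de-biasing mentioned above, the sketch below shows the “neutralize” step used in research on debiasing word embeddings: estimate a gender direction from definitional word pairs, then remove that component from words that should be gender-neutral. The vectors here are tiny hand-made stand-ins rather than real embeddings, and the function names are illustrative.

```python
import numpy as np

def gender_direction(pairs):
    """Estimate a 'gender' axis by averaging the differences of
    definitional word pairs (e.g. she/he, woman/man)."""
    diffs = [a - b for a, b in pairs]
    g = np.mean(diffs, axis=0)
    return g / np.linalg.norm(g)

def neutralize(v, g):
    """Remove the component of vector v that lies along the gender
    direction g, then re-normalize (the 'neutralize' step)."""
    v_debiased = v - np.dot(v, g) * g
    return v_debiased / np.linalg.norm(v_debiased)

# Toy 3-d vectors standing in for real embeddings (illustration only).
she, he = np.array([0.9, 0.1, 0.0]), np.array([-0.9, 0.1, 0.0])
woman, man = np.array([0.8, 0.2, 0.1]), np.array([-0.8, 0.2, 0.1])
programmer = np.array([-0.3, 0.7, 0.6])  # leans toward the 'he' side

g = gender_direction([(she, he), (woman, man)])
print(np.dot(programmer, g))                 # nonzero: gendered association
print(np.dot(neutralize(programmer, g), g))  # ~0 after neutralizing
```

After this step, a word like “programmer” sits equidistant from the gendered poles of the space, so analogy queries no longer pull it toward one gender by default.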