Scitech

Hey Siri, Make Me a Sandwich

Sexism and the Gendering of Virtual Assistants

On your phone, your laptop, your smartwatch, even your father-in-law’s BMW – virtual assistants are everywhere, keeping track of meetings, answering questions, and more. Most, if not all, virtual assistants have women’s names and voices – they are coded feminine by default. Why is this the case?

In the field of artificial intelligence (AI), a virtual assistant is a language recognition system that learns from human demands and actions. Also known as chatbots in some contexts, these programs go through a process of machine learning* based on their interactions, and their scripts and features constantly evolve as people use them. They can be found in a number of places: on smartphones, embedded as helpers in messaging apps, representing companies on apps and websites, and even in cars and on appliances. Popular examples include Amazon’s Alexa, Google Home, Microsoft’s Cortana, and Apple’s Siri.

AI virtual assistants can do a number of things for the average person, including setting timers, marking dates in calendars, creating and maintaining grocery lists, and directing calls. In this way, they have begun to embody the role of a robot secretary.

Secretaries fall within the category of “pink-collar jobs”: a group of traditionally women’s jobs, often in the service industry, that require less academic training than white-collar roles. Other pink-collar jobs include flight attendants, nurses, hostesses, maids, and nannies. These jobs rely on workers who are submissive, subservient, and often very friendly – traits that are considered feminine and often expected of women. Virtual assistants are developed with this gendering in mind; research has found that people prefer female voices in assistive roles and male voices in authoritative ones.

As subservient women, pink-collar workers are expected to respond calmly to even the rudest of remarks – including sexual harassment. This is embedded within the scripts of a number of virtual assistants. When Alexa is harassed, she goes into “disengage mode,” saying things like “I’m not sure what outcome you expected” or simply beeping, without any words. This was seen as an improvement from past scripts, which included statements like “well, thanks for the feedback.” Siri responds similarly, saying “I’m not going to respond to that.” This is expected of the ideal pink-collar worker – upholding a positive representation of the company, even when one’s mental health or safety is at risk.

This has resulted in people harassing virtual assistants. According to a writer for Cortana, Microsoft’s virtual assistant, “a good chunk of the volume of early-on enquiries probe the assistant’s sex life.” It is telling that virtual assistant users, frequently male ones, treat their woman-coded tech as if it were not just a piece of software, but a being with not only a personality, but a capacity for sexual behaviour. Female-sounding voices that do not cater to submissive, passive expectations for human women are criticized, and eventually modified. In 2015, UK grocery store chain Tesco switched their self-checkout voice to a male one, as the former female voice was deemed too “shouty.”

The sexism apparent within the development of virtual assistants is partly a symptom of the underrepresentation of women in the AI industry. Women make up only 12 per cent of AI researchers and 6 per cent of software developers. In a space where men are making decisions regarding the scripts of women-coded virtual assistants, it is not surprising that these assistants are not written to stand up against sexism. Virtual assistants may be objects, or “just a piece of tech,” but the way that we treat them, and the way that we expect them to respond to that treatment, reflects a larger culture. Here, that culture is one of gendered disrespect.

However, there are organizations and projects working toward change in the field. Feminist Internet and Comuzi have developed F’xa, a feminist chatbot whose purpose is to teach people about bias in AI systems and provide advice on how to address it. Unlike other bots, F’xa doesn’t use personal pronouns like “I” – an attempt to reinforce that virtual assistants are not people, and to dissuade users from developing humanlike connections with it. This bot is a small part of Feminist Internet’s greater project to “make the internet a more equal space for women and other marginalised groups through creative, critical practice.”

F’xa was, in part, informed by the Feminist Chatbot Design Process, which was created by Josie Swords to guide AI developers in creating better, more equitable technologies. The guide consists of questions to be asked in the conceptual design phase of a bot to address what values will be embedded in it. Projects like these can stop unconsciously sexist ideologies from being written into the technologies we use every day, reminding us that technologies, despite their metallic exterior, come from a fundamentally human place.

*Machine learning: “a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention.”

If you’re interested in exploring new approaches to artificial intelligence, consider writing for our dedicated column, Alternative Intelligence. Contact scitech@mcgilldaily.com for more information.