Alexa and now Iris: Women as the personalities of AI helpers

Photo courtesy of Pexels

How does Artificial Intelligence reinforce harmful gender stereotypes?

“There have been two moments in my life when I was happiest. The first was the day I met Josh. And the second, the day I killed him,” Sophie Thatcher said as Iris in the latest hit movie “Companion.”

Artificial Intelligence, or more commonly, AI, has become a household name. There is rarely a day on campus when you do not overhear someone mention using ChatGPT for an assignment or ten. We have accepted that we live in a society in which AI is becoming more and more prevalent, now even threatening jobs traditionally done by humans. Despite this, we are comforted by the belief that AI could never replace actual human relationships or interactions.

Except, now it can. 

“Companion,” starring Sophie Thatcher of “Yellowjackets,” was released in January of this year and sparked much discourse about the ever-growing role of AI in day-to-day life. The story follows Iris, played by Thatcher, and Josh, played by Jack Quaid, as they retreat to a friend’s cabin for a relaxing weekend. The couple seems perfectly normal at first, but as the weekend goes on, we begin to understand that Iris and Josh’s relationship is far from it. Iris is a “companion bot” created by the company Empathix to fix the loneliness epidemic so many of us are experiencing. Except, Iris and the other companion bots do much more than comfort and spend time with their partners. They can recite the exact weather, they can cook decadent meals, and of course, they have sex. Unlike a human partner, though, they can be turned off right afterward with a simple “go to sleep” command, as we see done to Iris.

Obviously this is unsettling, but it is a movie after all, so how accurate could it be? Well, it turns out that Artificial Intelligence is already playing a big role in the sexualization and commodification of women. The decision to make the default voices of AI helpers feminine is not a mere coincidence. In an interview with Business Insider, Daniel Rausch, an Amazon Vice President, justified the company’s decision to use a female voice for its helper, Alexa.

"We carried out research and found that a woman's voice is more 'sympathetic' and better received,” Rausch told Business Insider.

While this might be true, it would be ignorant not to acknowledge that having the default voice of these virtual helpers be feminine reinforces harmful stereotypes. 

The United Nations has been concerned with this phenomenon for many years now. Its 2019 publication, “I’d Blush If I Could,” titled after the response Apple’s default female assistant Siri once gave when told “Hey Siri, you’re a b—-,” tackles the unequal presence of women and girls in the tech industry as well as the harmful effects of gendered AI assistants.

The document explains that assigning a feminine character to these assistants solidifies the idea that women are supposed to occupy subservient positions. The way these assistants respond also promotes the idea that women should be compliant, and even appreciative, when faced with sexual or expletive comments like the one that inspired the document’s name.

Brianne Dosch, the UT subject librarian for the Applied Artificial Intelligence Curriculum Committee, said that in her experience with gendered robots and AI, those in domestic or emotional-management roles took on female voices, while those in jobs like cybersecurity or logistics had masculine voices. Dosch also pointed out the dangers of becoming too comfortable with AI.

“The manner people treat AI can mold how they interact with others. If someone routinely operates a submissive, gendered AI assistant, that interaction style could seep into real-world bonds in delicate ways,” Dosch said. 

She warned that it is crucial for people to recognize the effects, good or bad, that Artificial Intelligence could be having on them.

Whether you have noticed it before or this is the first time you are hearing about it, the role of gender in Artificial Intelligence is no coincidence. While the issue may not be as severe as presented in “Companion,” using female voices as the default setting in AI helpers reinforces toxic stereotypes and contributes to the narrative that women are a commodity in the workforce rather than independent beings.