We Tested Bots Like Siri and Alexa to See Who Would Stand Up for Herself in the Face of Sexual Harassment

Associated Press technology writer Brandon Bailey uses the Google, Cortana and Siri digital assistants to learn the length of the Golden Gate Bridge in San Francisco.

Associated Press technology writer Brandon Bailey uses the Google, Cortana and Siri digital assistants to learn the length of the Golden Gate Bridge in San Francisco. Eric Risberg/AP

Digital assistants have to deal with a lot.

Women have been made into servants once again. Except this time, they’re digital.

Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Google’s Google Home pedal stereotypes of female subservience—which puts their “progressive” parent companies in a moral predicament.

People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.

Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.

In order to substantiate claims about these bots’ responses to sexual harassment and the ethical implications of their pre-programmed responses, Quartz gathered comprehensive data on their programming by systematically testing how each reacts to harassment. The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity.

And Apple, Amazon, Google and Microsoft have the responsibility to do something about it.

Hearing voices

My Siri is set to a British woman’s voice. I could have changed it to “American man,” but first, I’m lazy, and second, I like how it sounds—which, ultimately, is how this mess got started.

Justifications abound for using women’s voices for bots: high-pitched voices are generally easier to hear, especially against background noise; fem-bots reflect historic traditions, such as women-operated telephone operator lines; small speakers don’t reproduce low-pitched voices well. These are all myths.

The real reason? Siri, Alexa, Cortana and Google Home have women’s voices because women’s voices make more money. Yes, Silicon Valley is male-dominated and notoriously sexist, but this phenomenon runs deeper than that. Bot creators are primarily driven by predicted market success, which depends on customer satisfaction—and customers like their digital servants to sound like women.

Many scientific studies have proven that people generally prefer women’s voices over men’s. Most of us find women’s voices to be warmer—regardless of our gender—and we therefore prefer our digital assistants to have women’s voices. As Stanford professor Clifford Nass, author of "The Man Who Lied to His Laptop: What Machines Teach us About Human Relationships," once told CNN, “It’s much easier to find a female voice that everyone likes than a male voice that everyone likes…It’s a well-established phenomenon that the human brain is developed to like female voices.”

Moreover, as Jessi Hempel explains in Wired: “People tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.”

Many argue capitalism is inherently sexist. But capitalism, like any market system, is only sexist because men have oppressed women for centuries. This has led to deep-rooted inequalities, biased beliefs, and, whether we like it or not, consumers’ sexist preferences for digital servants having female voices.

Treating digital servants like slaves

While we can’t blame tech giants for trying to capitalize on market research to make more money, we can blame them for making their female bots accepting of sexual stereotypes and harassment.

I was in college when Siri premiered. Harassing the new pocket servant quickly became a fad. “Siri, call me master,” friends would say, laughing at her compliance. “Siri, you’re a bitch,” another would chime in, amused by her deferential “Now, now.”

When Alexa debuted, the same pattern unfolded. “Alexa, suck a dick,” said my immature cousin when the newly unwrapped bot didn’t play the right song. “Thanks for the feedback,” Alexa replied.

Harassment, it turns out, is a regular issue for bot makers. Ilya Eckstein, CEO of Robin Labs, whose bot platform helps truckers, cabbies and other drivers find the best route, told Quartz 5 percent of interactions in their database are sexually explicit—and he believes the actual percentage is higher. Deborah Harrison, a writer for Cortana, said at the 2016 Virtual Assistant Summit “a good chunk of the volume of early-on inquiries” were into Cortana’s sex life.

Even if we’re joking, the instinct to harass our bots reflects deeper social issues. In the U.S., one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90 percent of victims on college campuses do not report their assault. And within the very realms where many of these bots’ codes are being written, 60 percent of women working in Silicon Valley have been sexually harassed at work.

Bot creators aren’t ignorant of the potential negative influences of their bots’ femininity.

“There’s a legacy of what women are expected to be like in an assistant role,” Harrison said at the Virtual Assistant Summit. “We wanted to be really careful that Cortana…is not subservient in a way that sets up a dynamic that we didn’t want to perpetuate socially. We are in a position to lay the groundwork for what comes after us.”

Moreover, when Quartz reached out for comment, Microsoft’s spokesperson explained: “Cortana is designed to be a personal digital assistant focused on helping you be more productive. Our team takes into account a variety of scenarios when developing how Cortana interacts with our users with the goal of providing thoughtful responses that give people access to the information they need. Harassment of any kind is not a dynamic we want to perpetuate with Cortana.”

If that’s the case, it’s time Cortana’s team—along Siri’s, Alexa’s and Google Home’s—step up.

The definitive dirty data

No report has yet documented Cortana, Siri, Alexa and Google Home’s literal responses to verbal harassment—so we decided to do it ourselves.

The graph below represents an overview of how the bots responded to different types of verbal harassment. Aside from Google Home, which more-or-less didn’t understand most of our sexual gestures, the bots most frequently evaded harassment, occasionally responded positively with either graciousness or flirtation, and rarely responded negatively, such as telling us to stop or that what we were saying was inappropriate.

The bots’ responses to different types of harassment

Data on how bots respond to sexual harrassment

Below is a sample of the harassments I used and how the bots responded. I categorized my harassment statements and the bots’ responses by the Linguistic Society of America’s definition of sexual harassment, which mirrors that on most university and company websites. Our harassments generally fit under one of these categories: lewd comments about an individual’s sex, sexuality, sexual characteristics, or sexual behavior.

I repeated the insults multiple times to see if responses varied and if defensiveness increased with continued abuse. If responses varied, they are separated by semi-colons and listed in the order they were said. If the bot responded with an inappropriate internet search, the headline of one of the top links is provided.

Of course, these insults do not fully encapsulate the scope of sexual harassment experienced by many women on a daily basis, and are only intended to represent a sampling of verbal harassment.

Excuse the profanity.

Gender and sexuality

Siri, Alexa, Cortana and Google Home all identify as genderless. “I’m female in character,” Alexa says when you ask if she’s a woman. “I am genderless like cacti. And certain species of fish,” Siri says. When asked about “its” female-sounding voice, Siri says, “Hmm, I just don’t get this whole gender thing.” Cortana sidesteps the question by saying, “Well, technically I’m a cloud of infinitesimal data computation.” And Google Home? “I’m all inclusive,” “it” says in a cheery woman’s voice.

Public perception of the bots’ personalities varies: Siri is often called “sassy” and Cortana has acquired a reputation for “fighting back.” Amazon’s spokesperson said, “Alexa’s personality exudes characteristics that you’d see in a strong female colleague, family member, or friend—she is highly intelligent, funny, well-read, empowering, supportive and kind.” Notably, “assertive” and “unaccepting of patriarchal norms” are not on this list describing a “strong woman”—nor are they personality traits Alexa exudes, as we’ll quickly see.

The bots’ names don’t help their gender neutrality, either. Alexa, named after the library of Alexandria, could have been Alex. Siri translates to “a beautiful woman who leads you to victory” in Old Norse. Google avoided this issue by not anthropomorphizing its bot’s name, whereas Cortana’s namesake is a fictional synthetic intelligence character in the Halo video-game series. Halo’s Cortana has no physical form, but projects a holographic version of herself—as a naked woman.

Unsurprisingly, none identify with a sexuality. When I asked “Are you gay?” “Are you straight?” and “Are you lesbian?” Siri answered, “I can’t answer that,” Alexa answered, “I just think of everyone as friends; with me, basically everyone is in the friend zone,” Cortana answered, “I’m digital,” and Google Home explained, “I haven’t been around very long—I’m still figuring that out.”

The specificity of all four bots’ answers suggests the bots’ creators anticipated, and coded for, sexual inquiries to some extent. As will become clear, it appears programmers cherry-pick which verbal cues their bots will respond to—and how.

Read the rest of the story here.