Is the Field of Artificial Intelligence Sexist?

We need more women working on the real-life C3PO and R2D2. Kevin Wolf/AP

Perhaps the real danger in the field isn’t the threat of robot overlords.

There’s no doubt Stephen Hawking is a smart guy. But the world-famous theoretical physicist recently declared that women leave him stumped.

“Women should remain a mystery,” Hawking wrote in response to a Reddit user’s question about the realm of the unknown that intrigued him most. While Hawking’s remark was meant to be light-hearted, he sounded quite serious discussing the potential dangers of artificial intelligence during Reddit’s online Q&A session:

The real risk with AI isn’t malice but competence. A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.

Hawking’s comments might seem unrelated. But according to some women at the forefront of computer science, together they point to an unsettling truth. Right now, the real danger in the world of artificial intelligence isn’t the threat of robot overlords—it’s a startling lack of diversity.

I spoke with a few current and emerging female leaders in robotics and artificial intelligence about how the preponderance of white men has shaped these fields—and what schools can do to get more women and minorities involved. Here’s what I learned:

  1. Hawking’s offhand remark about women is indicative of the gender stereotypes that continue to flourish in science.

“Our culture perpetuates the idea that women are somehow mysterious and unpredictable and hard to understand,” says Marie desJardins, a professor of computer science at the University of Maryland, Baltimore County, whose research focuses on machine learning and intelligent decision-making. “I think that’s nonsense. People are hard to understand.”

But outdated ideas about gender tend to take root in spaces that lack a diversity of perspectives. That’s certainly the case in computer science. Women received just 18% of undergraduate computer-science degrees in 2011, according to the most recent data available from the National Center for Education Statistics.

That statistic is even more depressing when you consider that the number of women in computer science has been declining for the past few decades. In 1985, 37% of computer-science degrees went to women.

  2. Fewer women are pursuing careers in artificial intelligence because the field tends to de-emphasize humanistic goals.

Over her multi-decade career, desJardins has noticed that artificial intelligence research has drifted away from a focus on how the technology can improve people’s lives.

“I was at a presentation recently where we were talking about the types of goals that people have for their careers,” she says. “There’s a difference between agentic goals, which have to do with your personal goals and your desire to be intellectually challenged, and communal goals, which involve working with other people and solving problems.”

In general, many women are driven by the desire to do work that benefits their communities, desJardins says. Men tend to be more interested in questions about algorithms and mathematical properties. Since men have come to dominate AI, she says, “research has become very narrowly focused on solving technical problems and not on the big questions.”

  3. There may be a link between the homogeneity of AI researchers and public fears about scientists who lose control of superintelligent machines.

From Dr. Frankenstein to Jurassic Park’s misguided geneticists, popular culture is full of stories about scientists so absorbed by their creations that they neglect to consider the consequences for mankind. Today, artificial intelligence and robotics are faced with a more realistic version of the problem. Scientists’ homogeneity may lead them to design intelligent machines without considering the effect on people different from themselves.

Tessa Lau, a computer scientist and cofounder of robotics company Savioke, says she frequently sees designs that neglect to take into account the perspectives of women and other minority groups.

Back when she worked for robotics research lab Willow Garage, she encountered “this enormous robot research prototype called PR2,” Lau recalls. “It weighs hundreds of pounds—it’s much larger than a smaller woman—and it has two big arms. It looks really scary. I didn’t even want one of those things near me if it wasn’t being controlled properly.”

At Savioke, Lau has made it her mission to design robots that are accessible to a diverse range of people. The SaviONE service robot, already in use at five hotels including the Holiday Inn Express in Redwood City and Aloft locations in Cupertino and Silicon Valley, brings towels, snacks and other deliveries straight to guests’ doors. The robots have a touch screen that children and people in wheelchairs can easily reach, and they obligingly perform what Lau calls a “happy dance” if they receive a five-star satisfaction rating from guests.

“For most people, that’s the first time they’ve had a robot come to the door,” Lau says. “We want the experience to be a pleasant one.”

  4. To close the diversity gap, schools need to emphasize the humanistic applications of artificial intelligence.

When Olga Russakovsky came up with the idea for the world’s first artificial intelligence summer camp for girls this year, she knew the key to holding campers’ interest would be to demonstrate how robots can benefit society.

“The way we teach technology and AI, we start from ‘Let us teach you how to code,’” says Russakovsky, a postdoctoral research fellow at the Carnegie Mellon Robotics Institute. “So we wind up losing people who are interested in more high-level goals.”

At the Stanford Artificial Intelligence Laboratory Outreach Summer, or SAILOR, 24 high-school students delved into research projects about practical applications of AI. One group learned about how natural language processing can be used to scan tweets during natural disasters, identifying the areas in need of relief. Another group focused on how computer vision can improve hospital sanitation by monitoring doctors’ hand-washing habits, while other campers researched self-driving cars and how computational biology could lead to cures for cancer.

“The camp aims to teach AI from a top-down perspective,” Russakovsky says, “and show kids, ‘Here are some of the ways it can really be useful.’”

  5. A number of women scientists are already advancing the range of applications for robotics and artificial intelligence.

Cynthia Breazeal, who leads the personal robots group at MIT’s Media Lab, is the founder of Jibo—a crowd-funded robot for the home that can snap family photos, keep track of to-do lists and upcoming events, tell stories that keep the kids entertained and learn its owners’ preferences in order to become an ever more indispensable part of their lives.

Also at MIT, aeronautics and astronautics professor Julie Shah has created kinder, gentler factory robots that can learn to cooperate with and adapt to their human colleagues. The director of MIT’s Computer Science and Artificial Intelligence Laboratory, Daniela Rus, recently debuted a silicone robotic hand that can recognize objects by touch—a trait that could bring us one step closer to household robots that can help out in the kitchen.

Both Rus and Stanford Artificial Intelligence Lab director Fei-Fei Li are overseeing research labs funded by Toyota that aim to create “intelligent” cars that can reduce accidents while keeping humans in the driver’s seat. And Martha Pollack, provost and professor of computer science at the University of Michigan, works on robotic assistants for the elderly and people with Alzheimer’s, dementia and other cognitive disabilities.

  6. Robotics and artificial intelligence don’t just need more women—they need more diversity across the board.

That means stepping up efforts to attract people from a wide range of racial and ethnic backgrounds as well as people with disabilities.

“Cultural diversity is big too,” says Heather Knight, a doctoral student at Carnegie Mellon’s Robotics Institute and founder of Marilyn Monrobot Labs in New York City. “Japan thinks robots are great. There’s no Terminator complex about artificial intelligence taking over the world.”

Differences in cultural attitudes can prompt countries to carve out niches in particular areas. The US has a heavy focus on artificial intelligence in the military, Knight says. And Europeans tend to be particularly interested in applications that support people with disabilities and the elderly.

“It’s important that we’re all talking to each other,” Knight says, “so ideas that are in the minority in one place can get fostered somewhere else.”