Even for a company that’s trying to produce driverless cars and “solve” mortality, getting employees to overcome their own biases is a challenge.
Self-driving cars, balloons that beam Internet service to previously unconnected citizens below, immortality—these are the farsighted, high-risk pursuits that Google calls its “moonshots.” But another one of its wildly ambitious projects isn’t classified as such, and falls a lot closer to campus: curbing workplace discrimination. The company, which has roughly two male employees for every female employee, has spent three years making data-based revisions to its hiring and promotion processes.
No company—and certainly no tech company—has figured out how to dissolve the unconscious biases that govern human-resources decisions. And even if Google found a proven fix for its diversity problem, change would still come slowly. “At our rate of hiring, if we wanted to move to 50-50, we’d have to hire only women for something like the next four, five, or six years,” says Laszlo Bock, the senior vice president of people operations at Google. “To have a meaningful change in the numbers and representation is actually going to take a while, because it turns out it’s illegal to only hire women or only hire African Americans. So it’s going more slowly than I’d like, and more slowly than we’d like.”
Since 2012, Bock’s division has been studying unconscious bias and experimenting with ways to get employees to reflect on their preconceptions. “Those of us who are raised in a cultural context have the same associations. It doesn’t matter if you’re male or female or in science or in liberal arts,” Brian Welle, the director of people analytics at Google, has said. Drawing heavily on social-science research, including a landmark 2003 study that found that white and black job candidates faced vastly different standards, Google’s team has become convinced that the key is getting people to admit their own biases before making decisions.
With this in mind, Bock’s division has made big changes in the company, like removing the requirement that employees nominate themselves for promotion (women were much less likely to self-nominate than men). And it’s made subtle ones too, like naming more conference rooms after female scientists upon noticing that all but one of the 15 conference rooms on a certain floor bore the names of inspirational men.
Meanwhile, Google’s recruiters have been told not to prioritize information on résumés that analytics has shown to be unhelpful in predicting employee performance, including names, addresses, and even alma maters. In 2013, between 20,000 and 25,000 Google employees (a little more than half of the company, at the time) chose to participate in workshops on the subject of bias, and now a similar presentation is given to all incoming employees.
Has any of this paid off? So far, Bock is encouraged by the results, even though non-anecdotal, non-self-reported data is scant. “We have a lot of qualitative information, like illustrations of when things improved,” he says. “What we then do is we look at not just self-reported data about ‘How do I feel?’ but ‘What’s my assessment of the environment?’…There’s early positive signs that we’re making progress on those kinds of questions.” He’s confident enough in Google’s program that he says he hopes to “slim it down into modules or a kit” that could be used by other companies, organizations, and the government. “I’d love to put something out sometime this summer, certainly this year,” he says.
“When firms have worked on this stuff, they traditionally have brought in…bias training and called it a day,” says Joan C. Williams, the director of the Center for WorkLife Law at the University of California Hastings College of the Law. “Doing anything once does not change a culture.” But she’s relatively impressed by the efforts of Bock’s team. “Google is the company that I see that seems to be really serious about getting analytical about how bias might be playing out, and then putting in place these bias interrupters to interrupt the transmission of bias,” she says.
That said, the effectiveness of the program is hard to gauge from outside the company. Williams says she’s tried to talk to Google about the specifics of its programs, but that the company hasn’t been receptive to collaborating with her. She suspects Google’s reluctance comes from fears that making some numbers public would amount to inviting a lawsuit. “What the metrics show, potentially, is that you have bias, or you had it in the past, which is kind of a scary thing from a legal standpoint. It’s not surprising that they’re playing this very close to the chest,” she says.
Are there any companies being more open about their internal diversity programs? Williams says she’s assembled a group of researchers to work on studies that don’t simply reconfirm the existence of bias but examine how it can be defused. Some organizations have agreed to cooperate with the working group, letting researchers study their ranks as they implement experimental programs. “But those names aren’t public at this point,” Williams says. For now, the revolution is happening behind closed doors.