Amazon Shareholders Move to Stop Sales of Facial Recognition Tech to Government Agencies


A group of shareholders filed a resolution to halt sales until the board can consider the tech’s societal impact.

A group of Amazon shareholders is seeking to force a vote at the company’s upcoming annual meeting to prevent it from selling its facial recognition technology to the government until the board of directors has a chance to evaluate the technology’s societal impacts.

“Shareholders request that the Board of Directors prohibit sales of facial recognition technology to government agencies unless the Board concludes, after an evaluation using independent evidence, that the technology does not cause or contribute to actual or potential violations of civil and human rights,” five shareholder groups wrote in a resolution released publicly Thursday. The resolution was submitted by lead filer Sisters of St. Joseph of Brentwood and four co-filers: The Sisters of St. Francis Charitable Trust (Dubuque), Sisters of St. Francis of Philadelphia, Maryknoll Sisters and Azzad Asset Management.

According to organizers at Open MIC—a group that describes its goal as fostering “shareholder engagement at leading tech and media companies”—the shareholders intend to bring the resolution to a vote at the annual meeting this spring.

Open MIC Executive Director Michael Connor told Nextgov one of three things could happen next: Amazon principals could address the shareholders’ concerns, prompting the group to withdraw the resolution; the company could take no action, which would bring the resolution to a vote at the annual meeting; or the company could fight the filing at the Securities and Exchange Commission.

If Amazon goes the latter route, “the proponents will fight that claim and vigorously defend the resolution,” Connor said.

“Sales of Rekognition to government represent considerable risk for the company and investors. That’s why it’s imperative those sales be halted immediately,” Connor wrote in a blog post Thursday. Connor cited reports that Amazon’s Rekognition system incorrectly matched 28 members of Congress to mug shots and had a 5 percent overall error rate among legislators. That error rate is much higher among minority populations, prompting civil rights groups to issue warnings that such technologies will lead to more racial profiling and civil rights abuses.
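The reported 5 percent figure squares with the raw numbers: Congress has 535 voting members (100 senators and 435 representatives), and 28 false matches works out to roughly 5.2 percent. A quick check:

```python
# Sanity check on the reported figure: 28 false matches out of
# Congress's 535 voting members (100 senators + 435 representatives).
false_matches = 28
members_of_congress = 535

error_rate = false_matches / members_of_congress * 100
print(f"{error_rate:.1f}%")  # prints "5.2%", consistent with the ~5 percent reported
```
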

“Shareholders have little evidence our Company is effectively restricting the use of Rekognition to protect privacy and civil rights,” the resolution states, citing Amazon Web Services Vice President Teresa Carlson’s statements to a reporter that the company is “unwaveringly in support of our law enforcement, defense and intelligence community.”

“We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future,” Matt Wood, general manager of artificial intelligence for Amazon Web Services, told Nextgov in a statement Thursday. “Through responsible use, the benefits have far outweighed the risks.”

Wood cited examples of Rekognition being used to stop human trafficking, inhibit child exploitation, reunite missing children with their families, and build educational apps for children. He also said there have been no reported abuses of the program by law enforcement.

Facial recognition systems—which use biometric technologies to match faces against stored images in real time—are being used by a number of federal agencies, including the FBI, the White House and Customs and Border Protection, which is running several pilots at international airports and border crossings.

Thursday’s shareholder letter follows a similar correspondence sent to Amazon CEO Jeff Bezos Tuesday, signed by 85 “racial justice, faith, and civil, human and immigrants’ rights groups,” according to the American Civil Liberties Union.

“We continue to urge Amazon to heed calls from civil, human and immigrants’ rights groups, academics, lawmakers and its own shareholders, employees and consumers to stop selling facial recognition technology to the government,” said Shankar Narayan, Technology and Liberty Project director for ACLU of Washington.

Despite these efforts, surveys show Americans are generally in favor of using facial biometrics for law enforcement.

Only one-quarter of Americans polled in a December 2018 survey said they would support government regulations limiting the use of facial recognition technologies, and only 18 percent said they agreed with strict limits on the technology if they come at the expense of public safety.

“People are often suspicious of new technologies, but in this case, they seem to have warmed up to facial recognition technology quite quickly,” said Daniel Castro, director of the Center for Data Innovation, a nonprofit, nonpartisan research institute that conducted the survey. “Perhaps most importantly, Americans have made it clear they do not want regulations that limit the use of facial recognition if it comes at the cost of public safety.”