Lawmakers Introduce Bill to Curb Algorithmic Bias

Sen. Cory Booker (Alex Brandon/AP)

The Algorithmic Accountability Act would force companies to check whether their tech is making biased, inaccurate, discriminatory or otherwise unfair decisions.

Lawmakers want to make sure the algorithms companies use to target ads, recruit employees and make other decisions aren’t inherently biased against certain people.

Sens. Ron Wyden, D-Ore., and Cory Booker, D-N.J., on Wednesday introduced legislation that would require organizations to assess the objectivity of their algorithms and correct any issues that might unfairly skew their results. As society depends on tech to make increasingly consequential decisions, the Algorithmic Accountability Act aims to create a level playing field for people of all backgrounds.

Rep. Yvette Clarke, D-N.Y., introduced a companion bill in the House.

Under the act, the Federal Trade Commission would compel companies to test both their algorithms and training data for any shortcomings that could lead to biased, inaccurate, discriminatory or otherwise unfair decisions. Groups would also need to ensure they’re protecting the privacy and security of the consumer data being fed into the algorithms.

Companies would be required to address any flaws they uncovered during the assessment.

The legislation would apply only to companies currently regulated by the FTC that earn more than $50 million per year, or to organizations that collect data on more than 1 million people.

“Algorithms shouldn’t have an exemption from our anti-discrimination laws,” Clarke said in a statement. “By requiring large companies to not turn a blind eye towards unintended impacts of their automated systems, the Algorithmic Accountability Act ensures 21st-century technologies are tools of empowerment, rather than marginalization, while also bolstering the security and privacy of all consumers.”

Algorithmic bias remains one of the most persistent problems facing the tech community as it works to improve decision-making with data.

The Housing and Urban Development Department recently sued Facebook for letting advertisers target—and exclude—people from seeing real estate ads based on age, race, sex and other protected categories. The American Civil Liberties Union used images of members of Congress to demonstrate racial bias in Amazon’s facial recognition system, which is currently used by the FBI and other law enforcement agencies.