Sens. Ron Wyden and Cory Booker also asked five industry giants how to mitigate discrimination against minority patients.
Lawmakers are calling on two federal agencies and five of America’s health care giants to show how they assess, or plan to assess, and address potential bias in the algorithms used in health care systems across the nation.
Sens. Ron Wyden, D-Ore., and Cory Booker, D-N.J., sent letters Tuesday to the Federal Trade Commission, the Centers for Medicare and Medicaid Services, and executives at UnitedHealth Group, Blue Cross Blue Shield, Cigna Corporation, Humana and Aetna, demanding more information about how they are addressing algorithmic bias and the harm it can cause in health care treatment for minority patients.
“In using algorithms, organizations often attempt to remove human flaws and biases from the process,” the senators wrote in the letters. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases.”
Wyden and Booker introduced the Algorithmic Accountability Act earlier this year, which would direct companies to identify and rectify flawed computer algorithms that result in unfair, biased or discriminatory decisions, and would also call on the government to help mitigate their effects. The latest letters, however, were prompted by a study recently published in the journal Science, which revealed racial bias in one algorithm that’s widely used in health systems across America. The results indicate that, because of a racially biased algorithm, a common software program “severely underestimated the health care needs of black patients.”
“The findings of this study are deeply troubling, particularly taken in the context of other biases, disparities and inequities that plague our health care system,” the senators wrote.
Though the organization whose product was flagged in the Science study is working to fix it, Booker and Wyden believe the government also bears a weighty responsibility to resolve related issues around bias across the industry. In the letter to CMS, they point to ways the agency already employs artificial intelligence and algorithms. They ask both CMS and the FTC how they are working to tackle serious issues around algorithmic bias, and whether the government’s existing enforcement mechanisms are adequate to solve the problems posed.
The senators also ask whether the FTC will commit to launching an investigation into how certain communities might be unfairly discriminated against by such algorithms. And in the letters to industry executives, the lawmakers ask how many algorithms their companies use to automate predictive health care decisions, and what those companies are doing to help eliminate algorithmic bias across their operations.
“Congress and companies like yours that play a role in the health and well-being of Americans must make a concerted effort to find out the degree to which this issue is widespread and move quickly to make lasting changes,” they wrote.