Are government decisions being made by AI? Lawmakers want to mandate disclosure


New legislation would also require agencies to have an appeals process for adverse decisions made using automated systems in areas like healthcare or government benefits.

A bipartisan trio of senators introduced a bill Thursday that they say is meant to ensure that “critical” government decisions made with automated systems — such as those related to employment, financial assistance, healthcare or government benefits — come with disclosures about the use of those systems and with appeal rights. 

The Transparent Automated Governance Act, or TAG Act, introduced by Sens. Gary Peters, D-Mich., Mike Braun, R-Ind., and James Lankford, R-Okla., would require the director of the Office of Management and Budget to release guidance on AI-based decision making in government, although it notes that guidance required by OMB under the 2020 AI in Government Act could also satisfy these requirements. 

The White House is currently working on guidance for agencies around the use of AI in government.

Under the bill, OMB would be tasked with issuing instructions for how agencies should disclose automated systems that determine or “substantially influence” government decisions. It would also require agencies to set up an “alternative review” by a person of any “critical decision.” 

The category of “critical decisions” covers government determinations that affect access to, or the cost or terms of, things like education, employment, utilities, government benefits, financial services, healthcare, housing, immigration services and more.

Agencies would also be directed to track information to “determine whether each automated system and augmented critical decision process … is accurate, reliable and, to the greatest extent practicable, explainable.”

“Artificial intelligence is already transforming how federal agencies are serving the public,” Peters — who chairs the Senate Homeland Security and Governmental Affairs Committee — said in a statement. “This bipartisan bill will ensure taxpayers know when they are interacting with certain federal AI systems and establishes a process for people to get answers about why these systems are making certain decisions.”

A fiscal year 2022 inventory found more than 1,100 current use cases of AI across the federal government, according to federal chief information officer Clare Martorana. Still, some have warned of a leadership and policy vacuum on the use of AI in government. 

Some advocacy groups, like the Electronic Privacy Information Center, or EPIC, say that the use of automated decision-making systems in government “is almost entirely unregulated and largely opaque,” pointing to risks around bias, discrimination, due process rights and the transparency and accountability of how the tools work. 

Braun said in a statement that the government “needs… to ensure that decisions aren’t being made without humans in the driver’s seat.”

“The federal government can and should thoughtfully integrate new technology to help improve customer service for Americans,” said Lankford. “But agencies should be transparent about when, where and how we are interacting with AI to ensure continuous oversight and accountability for how these tools impact Americans.”

The bill is the latest effort from Congress as lawmakers try to respond to rapid advances in AI and increased attention to generative AI, such as OpenAI’s ChatGPT. Microsoft made that tool, along with other AI tools, available to government customers earlier this week. 

Even as the Senate works to educate itself and outline a larger plan on AI, lawmakers have already introduced proposals ranging from establishing a new federal agency to oversee AI providers and digital platforms to setting up a task force to identify policy and legal gaps in the government’s own use of AI. 

Other proposals focus on the workforce. Last year, a bill to require more training for feds on AI was signed into law, and another proposal introduced in May would require training for government leadership.