Leaders at the Department of Veterans Affairs and CISA describe use cases where automation thrives but still requires human oversight.
The presence of artificial intelligence in the federal workforce is poised to expand, with officials emphasizing the human component behind automation and machine learning technologies.
Officials including Gil Alterovitz, director of the Department of Veterans Affairs' National Artificial Intelligence Institute, and Martin Stanley, branch chief of Strategic Technology at the Cybersecurity and Infrastructure Security Agency, discussed digitization within their respective agencies during a Thursday panel.
Alterovitz said that VA leadership has opened up new data scientist positions to serve as subject matter experts across the government.
“We've been working toward building pathways toward developing and assessing that AI knowledge,” he said. “We're working with a number of other agencies and really the idea there is to build that pipeline of talent with AI knowledge both from outside government [and] inside the government so that the result of that would be an agile and responsive federal workforce equipped with the necessary competencies for AI.”
Alterovitz also discussed the ethical parameters the VA has in place for its use of automated technology. He cited the executive order signed by former President Donald Trump that outlined how government agencies need to responsibly manage AI.
Alterovitz said that this framework set a minimum standard, and VA leadership is looking to move well above the recommendations.
“There's a number of different offices that are integrating in working across the space to ensure that we do trust…and we can enable trustworthy AI for the future and that the current applications and use cases fit that standard as well,” he said.
Among the AI use cases Alterovitz anticipates at the VA are those within veteran healthcare, namely medication and appointment support, as well as using wearable technology to tailor physical therapy exercises.
“AI can really be at the cornerstone of putting together that information that really comes in large quantities and across many patients,” he said.
Stanley added that CISA uses its own framework to integrate AI and machine learning into agency operations, one that encompasses some of the best practices laid out in the executive order. He also noted automation can be an asset in the federal sphere.
“We have…a talent shortage and a very large problem to solve. So wherever we can automate, I think that's probably the most obvious and best application of these technologies,” he said, citing one benefit of automation within CISA as cutting down on response time in the wake of cyber incidents.
He also said that these systems still require human supervision in certain situations.
Stanley noted that these use cases are simple enough to benefit from automated AI. Project managers in the public sector will need to apply these technologies appropriately, particularly in CISA's immediate responses to cyberattacks.
“This ties into, you know, machine learning in the sense that things where you have lots of examples, you know, predictable outcomes for those examples are things that lend themselves [to] automation well,” he said. “That's the knowledge that our project managers need to have as they're looking to apply particular technologies in particular mission spaces.”