Report: Invest in AI, but ‘It's Not Magic’
Agencies should avoid moonshot goals, a report suggests.
Artificial intelligence may revolutionize numerous government processes from awarding contracts to fighting crime, but when launching AI projects, agencies should favor realistic targets over moonshot goals, according to a report.
Released Monday by the Partnership for Public Service and IBM, the report offers guidance on expanding federal uses of AI and machine learning. Researchers based their recommendations on interviews with 14 government AI experts detailing how they adapted the technology to address concrete problems.
“Some places that look promising may turn out to be not so great—other places will turn out to be a big win,” Claude Yusti, a federal AI specialist at IBM who contributed to the report, told Nextgov. “The notion of starting small and [growing] incrementally is that you’re probably going to hit a few home runs but you should also be prepared to divest from some [projects].”
Though agencies are still in the early days of rolling out artificial intelligence solutions, the technology has the potential to save significant government resources. Deloitte estimated AI and machine learning could save feds up to 1.2 billion work hours and $41.1 billion every year.
While the potential is huge, researchers warned feds not to underestimate the time and money it takes to get initiatives off the ground. Still, agencies are making a mistake if they’re not at least thinking about how AI could impact their mission.
“Some federal organizations remain unaware of the opportunities AI presents, and how they can realize the possibilities of this growing field—a lack of understanding that is increasingly likely to put them at a major disadvantage,” the report states. “It is vital for government to make a strategic investment in understanding how to maximize AI’s benefits and use it to improve agencies and government as a whole.”
One major use case researchers examined was an Air Force initiative to use federal acquisition data to help officers determine which contract vehicles are best for specific products and structure them accordingly. The system would also help contractors better understand federal regulations and application processes without bringing in lawyers.
Feds have also used artificial intelligence to cut down on the time spent performing tedious tasks. The Bureau of Labor Statistics, for example, relies on artificial intelligence to help employees sift through hundreds of thousands of public surveys on workplace injuries.
Among the other cases researchers examined was an effort by Johnson County, Kansas, to direct government resources to people at risk of getting arrested and a project launched by the University of Southern California that helps predict where elephant poachers in Africa will set traps.
Though artificial intelligence promises to transform many of the ways government does business, “it’s not magic,” said Yusti. Not every initiative will pan out, so “you need to place your bet carefully,” he said, beginning with modest efforts instead of going all in on a big, expensive project.
The report also highlighted the importance of employee expertise in expanding agencies’ use of artificial intelligence. Though the government has increasingly struggled to attract top tech talent to its workforce, Yusti said that if agencies effectively collaborate and consolidate efforts on AI initiatives, they may already have everyone they need to get projects up and running.
“There’s some amount of skills you need,” he said, “but...none of these agencies said ‘first, we’ve got to hire 10 or 15 people before we could get our pilot started.’ They made do with the resources they had.”