VA’s latest AI inventory includes new suicide-prevention, EHR-focused use cases


Seventy-two of the AI use cases previously included in the department’s 2024 inventory were listed as retired, meaning that their “development and/or use has since been discontinued.”

The Department of Veterans Affairs is continuing to expand its adoption and exploration of artificial intelligence tools to bolster its operations, including examining how the emerging capabilities can better support veterans in crisis and enhance its new electronic health record system.

A December 2020 executive order signed during the first Trump administration directed non-intelligence agencies to publicly release inventories of their AI use cases. Although the current Trump administration rescinded Biden-era guidance that had expanded some of those reporting requirements, agencies were still required to publicly release their inventories in December.

VA’s 2025 AI use case inventory was publicly released at the end of January. The timing, consistent with most other federal agencies’ releases, was a result of the government shutdown and the ensuing federal holiday.

A Nextgov/FCW review of VA’s most recent documented uses of AI found that the department is increasingly looking at how AI tools can benefit its internal operations and its delivery of veteran health services, even as it has moved to retire almost one-third of the use cases it included in its 2024 inventory.  

VA’s 2025 inventory listed 367 AI use cases, a significant increase over the 227 it reported in 2024. Seventy-two of the latest use cases, however, were marked as retired, which the department said indicated they had been included in its 2024 inventory but that their “development and/or use has since been discontinued.”

The department’s active uses of AI remained relatively static from 2024 to 2025. The 2025 list noted 138 active uses of AI, and while VA’s 2024 inventory used different phrasing, “operation and maintenance,” to signify active uses, that list noted that 130 had been deployed.

“The department’s AI Inventory showcases VA’s dedication to transparent and responsible innovation, and enables us to track, evaluate, and optimize our AI systems while ensuring they are safe, secure and responsible,” VA Press Secretary Pete Kasperowicz said in a statement to Nextgov/FCW. “This year’s inventory shows steady growth, stronger governance, and expanding real-world impact, particularly in health care, benefits processing, and operational efficiency.”

VA’s latest inventory included the department’s ongoing use of the Recovery Engagement and Coordination for Health-Veteran Enhanced Treatment — or REACH VET — program, which Nextgov/FCW previously explored in depth to examine how VA applies AI capabilities to suicide prevention. The predictive model, which VA launched in April 2017, identifies veterans in the top 0.1% tier of suicide risk to facilitate more targeted prevention efforts. VA released a 2.0 version of the REACH VET model last year that incorporates new risk factors, such as military sexual trauma.
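Public descriptions of REACH VET cover its outputs rather than its internals, so any implementation detail is conjecture. As a minimal sketch of the tiering step alone, assuming an upstream model has already assigned each record a numeric risk score, flagging the top 0.1% reduces to a quantile cutoff; every name below is hypothetical.

```python
# Illustrative sketch only: the public inventory does not describe REACH VET's
# internals. This shows just the tiering step -- selecting the records whose
# model-assigned risk score falls in the top 0.1% -- using invented field names.
import numpy as np

def flag_top_tier(record_ids: list[str], risk_scores: np.ndarray,
                  tier: float = 0.001) -> list[str]:
    """Return the IDs whose score sits at or above the (1 - tier) quantile."""
    cutoff = np.quantile(risk_scores, 1.0 - tier)
    return [rid for rid, score in zip(record_ids, risk_scores)
            if score >= cutoff]

# Toy example: 10,000 synthetic records; roughly 10 should be flagged
# for clinician-led outreach.
rng = np.random.default_rng(0)
ids = [f"record-{i}" for i in range(10_000)]
scores = rng.random(10_000)
print(flag_top_tier(ids, scores))
```

In practice, as VA officials have stressed, such a flag is meant to trigger clinician outreach rather than any automated action.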

VA already uses AI-based training behind the scenes to help Veterans Crisis Line operators better engage with veterans. In its latest inventory, the department said it is in the pre-deployment phase of “Leveraging Acoustic-Linguistic Analytics and Social Determinants to Enhance Suicide Prevention Efforts in Veterans Crisis Line Interventions” — a use case that VA said would evaluate VCL call data “to identify imminent suicide risk and evaluate crisis intervention effectiveness.”
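The inventory names the approach only at a high level, and VA has not published the model. One common pattern for this kind of analysis, sketched below purely as an illustration, is to concatenate per-call acoustic and linguistic features into a single vector and fit one classifier over it; the features, sample values and labels here are all invented.

```python
# Purely illustrative: VA has not published this model. One conventional way to
# fuse acoustic and linguistic signals is to concatenate per-call feature
# vectors and fit a single classifier; every feature here is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-call features:
# [pause_fraction, pitch_variability, speech_rate,
#  crisis_keyword_rate, negation_rate]
X = np.array([
    [0.42, 0.9, 2.1, 0.08, 0.05],
    [0.10, 0.3, 4.0, 0.00, 0.01],
    [0.55, 1.2, 1.8, 0.12, 0.07],
    [0.15, 0.4, 3.6, 0.01, 0.02],
])
y = np.array([1, 0, 1, 0])  # toy labels: 1 = call later judged high-risk

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
print(model.predict_proba(X)[:, 1])  # per-call risk probabilities
```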

VA officials previously stressed to Nextgov/FCW that uses of AI to help veterans in crisis are only designed to augment clinician outreach or bolster crisis line training efforts. Researchers and veterans advocates have said that’s how these tools should always be used in mental health interventions.

The importance of maintaining human oversight of patient-facing AI uses has also been raised by a VA watchdog.

In a report released last month, the department’s Office of Inspector General said VA’s use of generative AI tools in clinical settings represents “a potential patient safety risk.”

The watchdog further warned that the Veterans Health Administration — VA’s healthcare arm — “does not have a formal mechanism to identify, track, or resolve risks associated with generative AI,” and expressed concern “about VHA’s ability to promote and safeguard patient safety without a standardized process for managing AI-related risks.”

In a statement to Nextgov/FCW following the release of OIG’s report, Kasperowicz said “VA clinicians only use AI as a support tool, and decisions about patient care are always made by the appropriate VA staff.”

Some of the AI use cases included in the inventory are also geared toward enhancing large-scale modernization projects across the department.

VA is in the process of restarting deployments of its new electronic health record system at 13 medical facilities this year, following an April 2023 pause on most site rollouts of the new software. The moratorium came after the medical facilities that received the software were beset by a host of challenges, including patient safety concerns, technical outages and usability issues.

During a summit last September, Dr. Neil Evans — acting program executive director of VA’s Electronic Health Record Modernization Integration Office — said the department was focused on successful deployments of the new system before working to onboard innovative capabilities like AI.

“One of the big value propositions of delivering a common, commercial EHR is that we can, in the future, more easily orchestrate and deliver an integrated technological experience, because we will be on the same baseline,” Evans said, noting that this includes the potential use of next-generation technologies.

VA’s updated AI strategy, which was released in October, also said the department’s early uses of AI would help inform future adoption of the capabilities in the new EHR system.

“As AI tools are validated and show worth, they will be incorporated into the EHR and many other information technology platforms through coordination between innovators and the teams managing those systems today,” the strategy said. 

The department’s 2025 inventory included five AI use cases pursued by VA’s Office of Electronic Health Record Modernization, which is working to transition the department to the new system.

In one instance, VA said it was in the pre-deployment phase of integrating a clinical AI agent into the new EHR system, noting that “administrative tasks, manual documentation, and complex workflows within the Electronic Health Record (EHR) cause lower clinical efficiency and operational effectiveness.”

The department said implementation of the clinical AI agent would allow VA providers to “leverage new technology to decrease time completing documentation and administrative tasks during a Veteran visit.”

In the 2024 inventory, the EHRM office’s documented use of AI was limited to a single “initiated” use case: a tool called Machine Algorithm for Report Surveillance, or MARS, designed “to triage ServiceNow incident tickets to determine if there is a potential patient safety impact.”

That inventory noted that “Data from the federal Electronic Health Record is not used as an input to MARS,” although it said that as more medical facilities deploy the new system, “the number of incident tickets will increase — further impacting the feasibility of manual processes to review incident tickets.”
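The inventory describes the tool’s task but not its design. As a rough sketch of what incident-ticket triage of this kind can look like, the snippet below fits a bag-of-words classifier over ticket text; the sample tickets and labels are invented, not drawn from VA data.

```python
# Illustrative only: MARS' actual design is not public. A minimal text-
# classification triage over ticket descriptions might look like this.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "Medication order failed to transmit to pharmacy",
    "Printer offline in records office",
    "Allergy alert did not fire during order entry",
    "Password reset request for scheduling clerk",
]
labels = [1, 0, 1, 0]  # 1 = potential patient safety impact (toy labels)

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(tickets, labels)

# Route a new ticket: a positive prediction would queue it for human review.
print(triage.predict(["Lab results not visible to ordering clinician"]))
```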

In the 2025 AI use case inventory, VA listed MARS as retired.