Law enforcement is the leading DHS use case for AI

ICE agents leave a residence after knocking on the door on January 28, 2026 in Minneapolis, Minnesota. Stephen Maturen/Getty Images
Subagencies like ICE and CBP are leveraging biometric technologies such as facial recognition extensively, per the new use case inventory.
The Department of Homeland Security’s most common application for artificial intelligence in 2025 was in law enforcement operations, according to the agency’s AI use case inventory released Wednesday.
Federal agencies are slated to publish their inventories beginning on Jan. 28, as stipulated by a memorandum from the Office of Management and Budget released in April 2025.
Of a total of 238 listed AI use cases being deployed across the agency, 86 are being employed in a law enforcement capacity, most often at Customs and Border Protection and Immigration and Customs Enforcement, which reported 49 and 29 use cases in this field, respectively. Indeed, those two subagencies collectively accounted for over half of all use cases DHS reported in 2025, with CBP at 83 and ICE at 51. ICE’s total number of use cases nearly doubled from 2024 to 2025.
And though the topic categorization for AI use cases in 2024 was structured differently than in the most recent data, 70 of that year's reported use cases were classified under "law and justice," with seven of those already retired at the time the agency reported them.
For 2025, DHS reported that 35 of the law enforcement use cases are classified as "high impact." OMB defines a high-impact use case as one whose "output serves as a principal basis for decisions or actions that have a legal, material, binding, or significant effect on rights or safety."
The purposes behind these AI systems are diverse. The types of systems –– at various levels of deployment –– include a Passport Anomaly Model, an AI Enabled Autonomous Underwater Vehicle, Anomaly Detections, a Smartphone Information Forensics Triage, AI-Enhanced ICE Tip Processing, Open Source and Social Media Analysis, and more.
Biometrics feature heavily in DHS AI usage. Biometric technologies powered by computer vision tools underpin many of the law enforcement use cases. Facial recognition systems are a prominent subcategory, leveraged in identity verification processes as well as in national security and child endangerment investigations.
DHS reports that several of these facial recognition services, specifically a set of three employed by ICE, help personnel and investigators reduce time “manually searching for images” online to improve the speed and efficacy of a given investigation.
DHS has not shied away from AI acquisition and adoption. The agency launched its internal generative AI chatbot in December 2024 and released its own playbook for federal agencies looking to expand their use of AI in operations, noting the importance of including civil rights and cybersecurity experts as stakeholders in the AI governance process.
In February 2025, the Office of the Inspector General noted DHS needed to do more to securely govern its AI systems, underscoring the privacy and civil rights requirements.
The report specifically flagged DHS as delinquent in its mandatory use case reporting, finding that 66 use cases –– including ICE's facial recognition system –– were initially omitted from the inventory.