ICE Outlines How Investigators Rely on Third-Party Facial Recognition Services


In a newly released impact assessment, the immigration enforcement agency details its use of facial biometrics and the privacy implications involved.

Law enforcement agencies at every level are incorporating facial recognition technology into investigative work, and Immigration and Customs Enforcement is no exception. However, rather than build its own database and biometric apps, the agency is opting to use third-party services from the private sector, state and local law enforcement and other federal agencies.

All of the facial recognition searches conducted for ICE’s Homeland Security Investigations, or HSI, division are handled entirely by third parties, either other government agencies or private sector vendors. According to a privacy impact assessment issued May 13 and released publicly last week, HSI agents either send photos to the facial recognition service through encrypted email or upload them through a third-party website. At no point does HSI manage any facial recognition software or image databases.

“In the course of its investigations, HSI routinely encounters digital images of potential victims or individuals suspected of crimes but cannot connect those images to identifiable information through existing investigative means and methods,” the document states. “HSI, therefore, submits those images to government agencies and commercial vendors to compare against their digital image galleries via facial recognition processes.”

These third-party agencies and vendors—each known as a facial recognition service, or FRS—manage large databases of images, facial biometric comparison software, or both.

“Those agencies and vendors accept facial images from third parties, including HSI, to run comparative queries of its own image galleries using its own facial recognition algorithm,” the impact assessment states.

The list of facial recognition services in the impact assessment “is not exhaustive,” the document states, “but will be updated in a PIA update if HSI uses an FRS of a significantly different type.”

The types of service providers include:

  • State and Local Law Enforcement: Many law enforcement agencies maintain large databases of biometric information, and some also connect to state departments of motor vehicles to enlarge those datasets. “HSI would only submit probe photos to a state or local [law enforcement agencies] if the agent had reason to believe the subject of the photo lived, visited, or had some other connection to that geographic location,” the PIA states.
  • Regional and Subject Matter-Specific Intelligence Fusion Centers: When criminal activity crosses state or national lines, HSI agents can tap into intelligence-sharing centers, better known as fusion centers. These centers can focus on a geographic region or on a specific subject, such as transnational gangs. Fusion centers also offer additional capabilities, such as advanced data analytics and visualizations. The PIA notes some fusion centers are currently working on the ability to query their large databases directly using facial recognition technology.
  • Federal Agencies: HSI agents have access to several federal databases, including DHS’s Automated Biometric Identification System, or IDENT, which is currently on track to be replaced by the cloud-based Homeland Advanced Recognition Technology, or HART, system. ICE investigators can also access the State Department’s Consular Consolidated Database to check images against passport and other travel document photos; the FBI’s Next Generation Identification System, or NGI, which stores photos of more than 38 million convicted criminals; and the Defense Department’s Automated Biometric Identity System, or ABIS, which is used in support of military operations and might soon be connected directly to the IDENT/HART system, according to the PIA.
  • Commercial Vendors: ICE’s facial recognition services program also taps industry offerors, primarily for open-source collections of publicly available images. Some vendors have also developed facial recognition software that HSI agents can use. In such cases, after an agent uploads an image to the application, the vendor is required to “delete the image immediately upon creation of a face template.” The PIA notes that, “While HSI cannot directly control the means or methods of a vendor’s data collection efforts, if HSI discovers that an FRS violates the privacy settings of an open-source system, HSI will discontinue using that vendor’s FRS.”

“ICE HSI primarily uses this law enforcement tool to identify victims of child exploitation and human trafficking, subjects engaged in the online and sexual exploitation of children, subjects engaged in financial fraud schemes, identity and benefit fraud, and those identified as members of transnational criminal organizations,” according to the document.

HSI agents collect images throughout the normal course of investigations, including “mug shots, surveillance photos, social media posts and images confiscated from phones or other data devices,” the impact assessment states, adding that agents can also take still shots from video recordings and streams.

However, in order for an image to qualify for facial recognition, it “must be directly relevant to an investigation and … only submitted to an FRS to further an active investigation.” Agents have also been instructed to use facial recognition as a last resort.

“Prior to submitting an image to an FRS, the HSI agent assigned to the case must first make reasonable efforts to identify the individual through … government database queries, open-source research, and other conventional investigative techniques based on biographical and other nonbiometric information,” according to the impact assessment. “The agent’s use of existing processes must be noted in the ICE Investigative Case Management System as a Report of Investigation.”

Whereas some law enforcement agencies require agents to submit additional data—such as information on the suspected crime or statutory authority for using the service—the PIA notes HSI agents are instructed to submit only the relevant photo and the case agent’s identification information.

“For purposes of individual privacy and investigative case integrity, HSI will refrain from submitting more data than needed,” the document states.

Similarly, the agency has instructed HSI agents to use the results of such services only as a means of gathering leads, not as the linchpin of an investigation. The privacy assessment compares their use to the HSI tip line, which generates 15,000 calls a month that agents follow up on but that are not used as the sole basis for arrest.

“HSI does not take enforcement action against any individual solely based on candidate images,” the document states. “Rather, HSI uses these candidate images as leads, which always requires further corroboration and investigation.”

But the PIA also outlines many of the problems with facial recognition technology, including the need for pristine conditions, high-quality cameras and, most importantly, large and diverse sets of training data. Poor lighting, old or outdated comparison images and insufficient training data lead to poor matching rates, which could have serious consequences for people in HSI’s crosshairs.

“An algorithm can be wrong in one of two ways: either guessing that images of two different individuals are the same person—false positive or false match—or guessing that two images of the same individual were not the same person—false negative or false non-match,” the document states. “While false non-match rates lower the efficacy of a facial recognition technology for HSI investigations, an algorithm’s false-match rate has the greatest impact on individual privacy. ICE is working with the DHS Directorate of Science and Technology on establishing an image quality capture standard to ensure consistency in data definition and accuracy for its use of facial recognition services.”
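
To make the two error types concrete, here is a minimal sketch of how a false-match rate and a false-non-match rate are conventionally tallied from labeled comparison trials. It is a generic Python illustration of the metrics the PIA describes, not code drawn from ICE, DHS Science and Technology or any vendor system, and the sample trial data is invented.

```python
# Illustrative only: tallying false-match and false-non-match rates from
# labeled comparison outcomes. Generic sketch; not an agency or vendor pipeline.

def error_rates(comparisons):
    """comparisons: iterable of (same_person: bool, algorithm_said_match: bool)."""
    impostor_trials = genuine_trials = 0     # different-person vs. same-person pairs
    false_matches = false_non_matches = 0

    for same_person, said_match in comparisons:
        if same_person:
            genuine_trials += 1
            if not said_match:
                false_non_matches += 1       # missed a true identity (false negative)
        else:
            impostor_trials += 1
            if said_match:
                false_matches += 1           # matched two different people (false positive)

    fmr = false_matches / impostor_trials if impostor_trials else 0.0
    fnmr = false_non_matches / genuine_trials if genuine_trials else 0.0
    return fmr, fnmr


if __name__ == "__main__":
    # Toy data: (truly the same person?, did the algorithm call it a match?)
    trials = [(True, True), (True, False), (False, False), (False, True), (False, False)]
    fmr, fnmr = error_rates(trials)
    print(f"false-match rate: {fmr:.2f}, false-non-match rate: {fnmr:.2f}")
```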

To mitigate some of these issues, HSI agents primarily use one-to-many—or 1:many—queries that produce a list of potential matches, rather than a single one-to-one result that could be a mismatch and lead an agent down the wrong path.
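
The candidate-list approach can be illustrated with a short sketch of a generic one-to-many query: a probe face embedding is scored against a gallery, and every candidate above a similarity threshold is returned in ranked order for human follow-up. The embeddings, subject IDs, threshold and scoring function here are hypothetical and do not describe any particular FRS.

```python
# Illustrative only: the general shape of a 1:many query, returning a ranked
# candidate list rather than a single asserted match. All data is hypothetical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def one_to_many(probe_embedding, gallery, top_k=5, threshold=0.6):
    """Return up to top_k gallery entries scoring above threshold, best first."""
    scored = [(cosine_similarity(probe_embedding, emb), subject_id)
              for subject_id, emb in gallery.items()]
    scored.sort(reverse=True)                       # highest similarity first
    return [(subject_id, score) for score, subject_id in scored
            if score >= threshold][:top_k]

if __name__ == "__main__":
    gallery = {                                     # hypothetical IDs -> face embeddings
        "subject-001": [0.9, 0.1, 0.3],
        "subject-002": [0.2, 0.8, 0.5],
        "subject-003": [0.7, 0.2, 0.4],
    }
    probe = [0.85, 0.15, 0.35]                      # embedding from the probe photo
    for subject_id, score in one_to_many(probe, gallery):
        print(f"{subject_id}: similarity {score:.2f}")   # leads only, not identifications
```

Because every entry on such a list is only a possible lead, the corroboration requirement described above would still apply to each returned candidate.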

“Some FRSs provide requestors the option of having their candidate lists reviewed by trained biometric face examiners,” the impact assessment notes. “The examiners will review algorithmic candidate returns using analysis of unique facial features called ‘morphological analysis.’ In these instances, the FRS technology will still provide a multiple candidate return, but the facial examiners provide an interim step of manual biometric analysis according to established industry standards and practices.”

The PIA also offers a list of specific privacy risks associated with this framework—such as the lack of adequate notice to subjects that their images are being run through facial recognition, or the possibility that the services could be used outside the approved scope—and the steps HSI is taking to mitigate those risks.

Other privacy risks—such as the inability of subjects to access or amend inaccuracies in vendor databases and services—have not been mitigated, according to the PIA.