A report shows the airport security agency excels at building and deploying screening tech but lacks a process to ensure those tools keep operating at a high level.
The Transportation Security Administration has a strong, results-oriented process for building, buying and deploying the screening technologies that keep air travel safe but does not have any procedures in place to make sure those tools continue to work after daily wear and tear, according to a government watchdog.
TSA’s technology deployment method begins and ends with the level of safety the agency is trying to achieve, according to a report from the Government Accountability Office. The agency sets detection standards for the minimum amount of a prohibited item or substance that a technology should be able to detect and the maximum failure rate allowable. Once the standards are agreed upon, TSA moves forward with buying or developing the technology required to meet those standards.
This process has created a stream of robust technologies being deployed at the nation’s airports. However, once those technologies are installed, TSA does not go back to ensure they meet the same high standards under which they were developed.
“According to agency officials, the agency uses certification to confirm that technologies meet detection requirements before they are deployed to airports, and calibration to confirm that technologies are at least minimally operational while in use at airports,” the report reads.
While GAO lauded TSA’s certification and testing process, the oversight agency noted those processes only confirm detection standards are met at a point in time.
“Certification does not ensure that deployed technologies continue to meet detection requirements because it does not account for the possibility that performance of technologies can degrade over time throughout the technologies’ lifecycles after deployment,” auditors wrote.
Auditors cited tests TSA conducted in 2015 and 2016 on deployed technologies designed to detect trace amounts of explosives. The tests showed the tech no longer met the minimum requirements for detection or acceptable failure rates. TSA officials told GAO the systems had degraded over time and were not properly maintained.
After those tests, TSA beefed up its preventive maintenance schedule. But that doesn’t go far enough, according to GAO.
“Because TSA does not test the units after they are deployed to airports, it cannot determine the extent to which these controls ensure technologies continue to meet detection requirements,” the report states. And while TSA does daily calibrations of its equipment, GAO auditors said this only certifies that “the screening technology is at least minimally operational” but “is not designed to test whether the screening technology meets detection requirements.”
TSA officials told GAO they have a plan to correct this issue. The agency is currently developing a process to review explosive detection technology annually to ensure it still meets the requirements.
However, the agency has yet to begin work on a similar process for passenger and carry-on baggage screening technologies, citing a lack of funding for that effort.
GAO also docked the agency for not fully documenting its risk decision process.
“TSA’s deployment decisions are generally based on logistical factors and it is unclear how risk is considered when determining where and in what order technologies are deployed because TSA did not document its decisions,” auditors said. “TSA considers risks across the civil aviation system when making acquisition decisions. However, TSA did not document the extent risk played a role in deployment, and could not fully explain how risk analyses contributed to those decisions. Moving forward, increased transparency about TSA’s decisions would better ensure that deployment of technologies matches potential risks.”