TSA considers vetting travelers by analyzing personal behavior gleaned from diverse commercial sources.
The Transportation Security Administration is considering screening passengers based on behavioral profiles generated by commercial data brokers, according to new contracting notices.
The agency could potentially use such assessments to let consenting passengers with favorable risk scores bypass certain airport security inspections, officials said.
For a moment, set aside fears about TSA peering into your gun-buying habits or pharmacy purchases. Are commercial data aggregations even accurate? No one knows. Not the Federal Trade Commission or the Justice Department. Not the data brokers. And not the people being profiled -- they often aren’t allowed to see their own records.
Private-sector analytics, including consumer behavior analyses, could provide valuable information for Homeland Security and law enforcement officials as citizens’ digital footprints expand, judicial system experts and privacy advocates acknowledge. The practice of harvesting and examining massive volumes of both structured and unstructured data, known as big data, is already in use at Justice’s Bureau of Alcohol, Tobacco, Firearms and Explosives to predict gun violence.
But taking action against individuals based on outdated or inaccurate data syntheses could be detrimental to government and society, the experts say.
In TSA’s plan, a private company would aggregate biographic and biometric “non-governmental data elements to generate an assessment of the risk to the aviation transportation system that may be posed by a specific individual,” states a Jan. 10 request for demonstrations.
The technology would provide a “reliable method that effectively identifies known travelers, based on a sound analysis and the application of an algorithm that produces dependable results,” according to the plan.
Fliers and most TSA officials would not know what factors are being weighed. “The specific sources and types of information employed for pre-screening purposes under this initiative may not be publicly disclosed,” officials said. The data collected for profiling “will not generally be disclosed to the TSA, but may be reviewed during an audit.”
Few details on data quality requirements are offered: The vendor must use “specific sources of current, accurate, and complete non-governmental data.”
Errors Carry Serious Repercussions
For homeland security, law enforcement and advertising purposes, big data increasingly relies on the same huge range of sources: social media posts, warranty postcards, state permits, local licenses, voter registration records, credit reports, and clickstreams, or Web surfing patterns, to name a few.
The FTC in December ordered nine data brokers to provide specific information on “whether the company monitors, audits, or evaluates the accuracy of personal data” used to target advertising. Commissioners, however, would not regulate law enforcement uses of this data, since the agency focuses on consumer privacy, FTC officials told Nextgov. Justice and FBI officials were unable to comment on whether they are interested in regulating the integrity of data used by such brokers.
The consequences of relying on corrupted data or algorithms will vary for each situation, privacy advocates say. The level of precision that satisfies advertisers is very different from the amount of exactitude federal authorities need, said Jennifer Granick, director of civil liberties for Stanford University’s Center for Internet and Society. “You can have 15 percent accuracy for advertising” which might be better than other forms of behavioral analyses, “but if you are getting 85 percent of it wrong when you are denying people government benefits or sending out police to interview them, that would be completely wasteful and dangerous,” she said.
A concern is that most registries contain outdated records, some law enforcement experts say. “The biggest problem is they don’t update,” said Paul Wormeli of the Integrated Justice Information Systems Institute, a federally funded organization. A citizen’s record is not automatically corrected if charges are later dismissed, a credit report entry was mistyped, or human resources files were mishandled, he said.
But having a single agency regulate data quality, or even several agencies regulate it, would be nearly impossible and futile, said Jeff Jonas, chief scientist for IBM Entity Analytics, who was not speaking on behalf of his company.
“Like any good market, organizations that aggregate will be competing on accuracy,” he said. “If they allow consumers to see what they have and provide the consumers attribution” for where the data originated, “then consumers will have an opportunity to fix things they care about fixing.”
Granick argued that fixes made in one database should, but don’t always, ripple through any system that the government also taps to make decisions. Today, the private sector is not required to correct consumer data. “There’s no right to access the profile that whatever advertisers of the world have compiled on me. Amazon has a profile on me and what they think I like, and I can refine it, but I can’t get a copy of what they do,” she said.
Marketing companies say new industry guidelines that let Internet users opt out of online tracking are sufficient. The principles adopted by the Digital Advertising Alliance, which includes data brokers such as Datalogix and Acxiom, also prohibit Web surfing analytics from being used to determine eligibility for employment, credit standing, healthcare treatment and insurance.
“To date it’s proven to work. We have very broad reach and people are following it,” said Stuart Ingis, counsel for the alliance.
Unlike those clickstream purveyors, credit bureaus are mandated by the Fair Credit Reporting Act to resolve errors that could grab the government’s attention, such as transactions signaling money laundering.
All the information is updated every 30 days, or by the end of a payment cycle, said Norm Magnuson, vice president of public affairs for the Consumer Data Industry Association. Citizens communicate name and address changes to their lenders, who in turn furnish those adjustments to the data brokers. “The furnisher may have more up-to-date address information than the post office,” Magnuson said.