New Guidance Aims to Plug Peepholes in City Surveillance Systems

The preliminary guidelines address the personal data of individuals captured by traffic sensors, speed cameras and other Internet-connected government systems.

A Commerce Department standards body has released preliminary guidelines for masking the personal data of individuals captured by traffic sensors, speed cameras and other Internet-connected government systems. 

Coincidentally, the publication came out the day a Spanish researcher demonstrated that any stalker can monitor the driving habits of customers patronizing dozens of European parking lots. An unnamed major provider of parking management systems allegedly has not been implementing typical security settings.

The aim of the National Institute of Standards and Technology guidelines, published Friday, is to nip such mistakes in the bud by instilling "privacy engineering objectives.” A principle dubbed disassociability, or the ability to hide an individual's identity during system processing, gets at the Big Brother concern.

"What I’m really interested in is starting to learn more about technologies that actually try to build privacy-enhancing technologies into them," Naomi Lefkovitz, NIST senior privacy policy adviser and a co-author of the study, said Sunday afternoon. "There’s work being done on studying photo and video images that can do anonymization techniques. Those are the things that are really exciting because, I think, often you get this sort of attitude of despair: 'Oh, we have all these sensors and they are just going to collect all this information and there’s nothing more we can do.'" 

On Friday, a Madrid-based security analyst presenting in Amsterdam illustrated a “smart city” surveillance nightmare waiting to happen.

A world map showed the location of parking management systems in Europe and Australia that are insecurely connected to the Internet and therefore accessible by any user. The map was part of a presentation by Jose Antonio Guasch at the Hack in the Box conference in the Netherlands.

"If the client has an account at the parking, and its personal information is registered, it is possible to know his presence, schedules, when enter or leave the parking… If the parking has a system of cameras installed, you can access directly to live streaming or even past recordings," Guasch said in a paper.

The NIST guidelines are intended to engage industry and lay the foundation for privacy safeguards that protect systems at the crossroads of cyberspace and the physical world.

"So our next step would be to begin trying to explore and map some controls so that when an engineer is saying, 'OK, I want my system to have this capability of disassociability, what kind of controls are available? Those kinds of technologies would be what they might try to deploy as a control to manage risks," Lefkovitz said. 

Many of the tools and techniques are still in the experimental phase, she added. 

Over the past year, a bevy of academic studies introduced ways of blinding surveillance system operators to the identities of individuals -- while retaining useful government statistics.

This "identity-insensitive information" is often desired for "visual understanding and data mining," states a paper Temple University and the Air Force Research Lab scientists released last fall.

Their procedure is called GARP -- an acronym that stands for determining Gender, Age and Race attributes of facial images, and Preserving these attributes.

The technique uses a computer algorithm that matches a real face with similar faces of the same gender, age and race located in an image gallery. Basically, the artificial intelligence figures out that the individual is, for instance, a young, white male and then blends his photo with the image of a young, white male model that "is most similar to the original face and has consistent attributes,” the researchers wrote.
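
In rough outline, the surrogate-selection and blending step can be sketched in a few lines of code. The snippet below is a simplified illustration drafted for this story, not the Temple and Air Force researchers' software; the gallery structure, the face feature vectors and the blending weight are all assumptions made for the example.

    # Illustrative sketch only -- not the GARP implementation.
    # Assumes faces are already encoded as numeric feature vectors and that
    # each gallery entry carries (gender, age group, race) labels.
    import numpy as np

    def deidentify(face_vec, face_img, attrs, gallery, alpha=0.8):
        """Blend a face with the most similar gallery face sharing its attributes."""
        # Keep only gallery faces whose gender/age/race labels match the original.
        candidates = [g for g in gallery if g["attrs"] == attrs]
        if not candidates:
            raise ValueError("no gallery face with matching attributes")
        # Pick the candidate whose feature vector is closest to the original face.
        best = min(candidates,
                   key=lambda g: np.linalg.norm(g["vec"] - face_vec))
        # A higher alpha leans harder on the surrogate, hiding more of the
        # original identity while the attribute filter keeps statistics intact.
        return (1 - alpha) * face_img + alpha * best["img"]

Pushing the weight toward the surrogate conceals more of the original identity, while the attribute filter preserves the gender, age and race information analysts want to keep.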

Masking Facial Recognition

Last November, researchers at Guangzhou University in China developed a method for securely transferring biometric data from where it is captured to a powerful, but "untrusted" cloud processor.

Their "outsourced computation” procedure cross-checks an encrypted image of an unknown individual with archived images in a way that masks the real face -- and the matches -- from the server that performs the search.

That month, separately, scientists from France's University of Caen presented a facial recognition privacy technique that swaps subjects' facial features with "face donor” features. The software application borrows models’ eyes, noses, chins and other face components. Faces are “de-identified by substituting their own face components with the donors’ ones, in such a way an automatic face matcher is fooled while the appearance of the generated faces are as close as possible to original faces," the French researchers wrote.
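
Stripped to its essentials, component substitution amounts to overwriting regions of an aligned face image with the corresponding regions of a donor face. The sketch below is a bare-bones illustration with made-up region coordinates, not the Caen team's application, which chooses donors and blends region boundaries far more carefully.

    # Minimal sketch of component-level face swapping -- hypothetical region
    # boxes for a 128x128 aligned face crop, not the Caen software.
    import numpy as np

    REGIONS = {                      # (top, bottom, left, right) pixel boxes
        "eyes":  (35, 60, 20, 108),
        "nose":  (55, 90, 48, 80),
        "mouth": (90, 115, 40, 88),
    }

    def swap_components(subject, donor, regions=REGIONS):
        """Return a copy of `subject` with the donor's face components pasted in."""
        out = subject.copy()
        for top, bottom, left, right in regions.values():
            out[top:bottom, left:right] = donor[top:bottom, left:right]
        return out

A real system would also smooth the seams so the composite still reads as a plausible face, which is what allows it to fool automated matchers without looking obviously doctored.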

The Privacy Risk Equation

The NIST publication covers more than dummifying smart systems embedded in government infrastructure. It also explores privacy protections for all Internet-facing federal systems, including those connected to cloud providers and ones that munch on big data from multiple sources. 

In practice, this means constructing a system in such a way that agencies and the public can rely on consistent confidentiality, and system administrators can manipulate personal information with fine-grained control.

In addition, the publication provides a mathematical formula -- or “privacy risk equation” -- that helps calculate the likelihood a problem will jeopardize privacy so agencies can allocate time, money and energy accordingly to reduce the chances.
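
The draft's treatment runs longer than a single line, but it follows the familiar risk-assessment shape of likelihood multiplied by impact. In rough notation -- and with the caveat that NIST's exact terms may differ -- the equation looks something like:

    \text{Privacy Risk} \;\approx\; \text{Likelihood of a problematic data action} \times \text{Impact of that data action}

Scoring each data action this way lets an agency rank which potential privacy problems deserve mitigation resources first.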

The departments of Transportation and Homeland Security are among the handful of agencies that might benefit from the smart-infrastructure privacy guidelines, Lefkovitz said.

At Transportation, "they do all kinds of things with sensors and vehicles on the road in order to support their mission in helping to improve transportation systems," she said. "Clearly, DHS is going to be involved in various sensor activities for national security, of course." 

Homeland Security's responsibilities wade into wireless-transportation systems. In December 2012, DHS issued a warning to local governments about a traffic-monitoring system that could expose drivers’ travel routines. 

Manufacturer Post Oak Traffic Systems had been using insecure encryption in roadway sensors designed to read data emitted by in-car Bluetooth equipment, such as hands-free cellphone tools. As a result, hackers potentially could eavesdrop on driver data through a “man-in-the-middle attack.” Post Oak quickly developed a patch to fix the security hole, the company said at the time. 

Such situations partly prompted NIST to develop practices for plugging peepholes in the Internet of Things. 

"Many of these issues converge in the particular privacy challenges governments are confronting as they implement 'smart city' technologies, such as managed traffic flow and automated ticketing (i.e. red light and speed cameras) that can collect information about people," the guidelines state. 

The NIST publication is open to public comment until July 13. 

(Image via littleny/Shutterstock.com)