Hacking Tools Get Peer Reviewed, Too


A government-led effort paves the way for data extracted from electronic devices to be accepted as evidence in court.

In September 2002, less than a year after Zacarias Moussaoui was indicted by a grand jury for his role in the 9/11 attacks, Moussaoui’s lawyers lodged an official complaint about how the government was handling digital evidence. They questioned the quality of the tools the government had used to extract data from some of the more than 200 hard drives that were submitted as evidence in the case—including one from Moussaoui’s own laptop.

When the government fired back, it leaned on a pair of official documents for backup: reports produced by the National Institute of Standards and Technology that described the workings of the software tools in detail. The documents showed the tools were the right ones for extracting information from those devices, the government lawyers argued, and that they had a track record of doing so accurately.

It was the first time a NIST report on a digital-forensics tool had been cited in a court of law. That its first courtroom appearance came in such a high-profile case was a promising start for NIST’s Computer Forensics Tool Testing project, which had begun about three years earlier. For nearly two decades, its mission has been to build a standardized, scientific foundation for evaluating the hardware and software regularly used in digital investigations.

Some of the tools investigators use are commercially available for download online, relatively cheap or even free; others are a little harder for a regular person to get hold of. They’re essentially hacking tools: computer programs and gadgets that hook up to a target device and extract its contents for searching and analysis.

“The digital evidence community wanted to make sure that they were doing forensics right,” said Barbara Guttman, who oversees the Software Quality Group at NIST. That community is made up of government agencies—like the Homeland Security Department or the National Institute of Justice, the Justice Department’s research arm—as well as state and local law enforcement agencies, prosecuting and defense attorneys, and private cybersecurity companies.

In addition to setting standards for digital evidence-gathering, the reports help users decide which tool they should use, based on the electronic device they’re looking at and the data they want to extract. They also help software vendors correct bugs in their products.

Today, the CFTT’s decidedly retro webpage—emblazoned with a quote from an episode of Star Trek: The Next Generation—hosts dozens of detailed reports about various forensics tools. Some reports focus on tools that recover deleted files, while others cover “file carving,” a technique that can reassemble files that are missing crucial metadata.
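File carving is easy to illustrate: rather than trusting file-system records, a carver scans raw bytes for the signatures that mark where known file types begin and end. Below is a minimal Python sketch that pulls JPEG candidates out of a disk image this way. It’s an illustration of the basic technique under simplified assumptions, not how any CFTT-tested tool is actually implemented; real carvers must also cope with fragmented files and false-positive matches.

    # carve_jpegs.py: naive JPEG carving from a raw disk image.
    # Illustrative sketch only: scans for JPEG start/end markers.
    import sys

    SOI = b"\xff\xd8\xff"  # JPEG start-of-image signature
    EOI = b"\xff\xd9"      # JPEG end-of-image marker

    def carve(image_path):
        data = open(image_path, "rb").read()
        count = 0
        start = data.find(SOI)
        while start != -1:
            end = data.find(EOI, start)
            if end == -1:
                break  # header with no trailer: stop scanning
            # Write out the candidate file, trailer included.
            with open(f"carved_{count:04d}.jpg", "wb") as out:
                out.write(data[start:end + len(EOI)])
            count += 1
            start = data.find(SOI, end + len(EOI))
        return count

    if __name__ == "__main__":
        print(f"carved {carve(sys.argv[1])} candidate files")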

The largest group of reports focuses on acquiring data from mobile devices. Smartphones have become an increasingly valuable source of evidence for law enforcement and prosecutors, because they’re now vast stores of private communication and information—but the sensitive nature of that data has made the government’s attempts to access it increasingly controversial.

“It’s a very fast-moving space, and it’s really important,” Guttman said. “Any case could potentially involve a mobile phone.”

It’s an odd feeling to flip through these public, unredacted government reports, which lay bare the frightful capabilities of commercially available mobile-extraction software. A report published just two weeks ago, for example, describes a tool called MOBILedit Forensic Express, made by San Francisco-based Compelson Labs. The tool works on Apple’s iPhone 6, 6S, and 6S Plus and two versions of the iPad, as well as several Samsung Galaxy smartphones and tablets. It can extract the following types of information from a mobile device:

… deleted data, call history, contacts, text messages, multimedia messages, files, events, notes, passwords for wifi networks, reminders and application data from apps such as Skype, Dropbox, Evernote, Facebook, WhatsApp, Viber, etc.

The product page for MOBILedit Forensic Express claims the software is capable of cracking passwords and PINs to get into locked phones, but it’s not clear how effective that feature is. Getting into a locked, encrypted smartphone—especially an iPhone—is difficult, and it’s unlikely MOBILedit can bypass every modern smartphone’s security system.

When the FBI tried to break into an iPhone 5C it found at the scene of the 2015 San Bernardino shooting, it initially wasn’t able to access the phone’s data, and asked Apple for help. (Presumably, the FBI would have had access to MOBILedit and other commercial tools.) Apple refused, and the FBI brought a lawsuit against the company—but withdrew it when agents finally found a way in.

Guttman said NIST doesn’t address phone encryption in its testing.

“Encryption is certainly an issue for law enforcement access to phones and other digital media, but that issue is outside of our expertise and the type of work we do, which is focused on software quality and software understanding,” she said.

The NIST report on MOBILedit describes how the tool fared against different combinations of smartphones and mobile operating systems. It found, for example, that the tool recovered only the first 69 characters of particularly long iOS notes. Apart from that issue and five others, though, the tool largely behaved as promised on iOS devices, the report says.
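A finding like that emerges from a simple idea: testers seed a device with known data, run the extraction tool, and compare what comes back against what went in. Here’s a minimal sketch of that comparison in Python, using made-up records rather than NIST’s actual test data or harness:

    # Illustrative validation check: compare data seeded onto a test
    # phone against what the extraction tool reported back.
    # All records here are hypothetical.
    seeded = {
        "note_001": "A" * 200,  # a deliberately long note
        "contact_001": "Jane Example, +1-555-0100",
    }
    extracted = {
        "note_001": "A" * 69,   # tool returned a truncated note
        "contact_001": "Jane Example, +1-555-0100",
    }

    for key, expected in seeded.items():
        got = extracted.get(key)
        if got == expected:
            print(f"{key}: as expected")
        elif got is not None and expected.startswith(got):
            print(f"{key}: truncated ({len(got)} of {len(expected)} chars)")
        else:
            print(f"{key}: missing or altered")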

“None of the tools are perfect,” Guttman said. “You really need to understand the strengths and limitations of the tools you’re using.”

Unlike some more complex tools, MOBILedit doesn’t require an investigator to open up a smartphone and manipulate its internals directly—the software connects to the target phone with a cord, just like a user might to update his or her device. But law enforcement doesn’t necessarily need to force its way into a phone it’s interested in searching, either by cracking open its case or by brute-forcing its passcode.

In certain cases, officers can just ask—or pressure—the phone’s owner to open it. That’s what happened when Sidd Bikkannavar, a NASA engineer, was stopped by a customs agent as he returned to his native United States from a vacation: The officer asked Bikkannavar to turn over his PIN, wrote it down, and took his smartphone to another room for about half an hour. When the agent returned the phone, he said he’d run “algorithms” to search for threats. It’s possible Bikkannavar’s phone was searched with one of the mobile-acquisition tools DHS has tested.

The government’s growing library of forensic tool reports is supplemented by other testers. Graduate students at the Forensic Science Center at Marshall University in West Virginia, for example, do some of the same sorts of testing NIST does. They often work with West Virginia State Police, which runs its own digital forensics lab on campus, to test extraction tools before they’re deployed. They post their results online, just like NIST does, to grow the body of shared knowledge about these tools.

“If we weren’t validating our software and hardware systems, that would come up in court,” said Terry Fenger, the director of Marshall’s Forensic Science Center. “Part of the validation process is to show the courts that the i’s were dotted and t’s crossed.”

A new NIST project called “federated testing” will make it easier for others to pitch in with their own test reports. It’s a free, downloadable disk image that contains all the tools needed to test certain types of forensic software and automatically generate a report. The first report from the project came in recently—from a public defender’s office in Missouri, an indication that digital forensics isn’t just the realm of law enforcement.
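The report-generation step can be pictured as little more than collecting structured test outcomes and rendering them in a consistent format. A minimal sketch, with a hypothetical tool name and made-up results rather than the CFTT’s actual report format:

    # Illustrative sketch of automated report generation in the
    # spirit of federated testing. Tool name and results are made up.
    import datetime
    import json

    results = [
        {"case": "acquire_contacts", "outcome": "pass"},
        {"case": "acquire_long_notes", "outcome": "anomaly",
         "note": "only first 69 characters recovered"},
    ]

    passed = sum(r["outcome"] == "pass" for r in results)
    report = {
        "tool": "ExampleExtractor 1.0",  # hypothetical
        "date": datetime.date.today().isoformat(),
        "summary": f"{passed} of {len(results)} test cases passed",
        "cases": results,
    }
    print(json.dumps(report, indent=2))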

I asked Fenger whether the technical information made public in these validation reports could help hackers or criminals circumvent the tools, but he said the validation data probably wouldn’t be of much value to a malicious hacker.

“It’s more or less just the nuts and bolts of how things work,” Fenger said. “Most of the hackers out there are way beyond the level of these validations.”