What the FBI vs. Apple Encryption Fight Is Really About

A customer tries out the new Apple iPhone 6S at an Apple store in Chicago. Kiichiro Sato/AP

When software engineers at Apple designed the iPhone’s security features, they labored knowing that millions were relying on them to safeguard their privacy. Insofar as their efforts succeeded, they would stymie spying by jealous exes; stop hackers from emptying bank accounts; prevent blackmailers from stealing nude photos; and thwart authoritarian governments from identifying dissidents.

San Bernardino County issued secure iPhones to employees including Syed Rizwan Farook, the health department inspector who, along with his wife, murdered 14 people.

The FBI understandably wants to snoop through his device.

But it ran up against Apple’s security features. Older iPhones require a four-digit passcode. And, if the owner has enabled an optional security setting, entering the wrong code 10 times automatically wipes them clean.

On Tuesday, a federal judge ordered Apple to write malware to load onto the dead terrorist’s phone, so that the FBI can keep guessing new codes electronically, forcing entry without causing the device to delete all the data that it contains.
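
To see why the ten-try wipe is the linchpin, consider the arithmetic: a four-digit passcode admits only 10^4 = 10,000 possibilities, so once the wipe and retry delays are disabled, electronic guessing exhausts the space almost instantly. Here is a minimal sketch in plain Python (an invented `passcode_matches` stub stands in for the phone’s hardware check; this is not Apple’s code) of what that guessing amounts to:

```python
from itertools import product

# Hypothetical stand-in for the phone's passcode check. On a real iPhone the
# test runs inside the device and is rate-limited; this sketch assumes those
# protections are gone, which is what the court order demands.
SECRET_CODE = "7316"  # invented for the demo; in reality it is unknown

def passcode_matches(guess: str) -> bool:
    return guess == SECRET_CODE

def brute_force() -> str:
    # A four-digit space is only 10**4 = 10,000 candidates.
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if passcode_matches(guess):
            return guess
    raise ValueError("no code matched")

print(brute_force())  # prints 7316 almost instantly
```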

Apple intends to appeal the order and issued a statement denouncing it.

Newer iPhones are more secure, and Apple doesn’t want the weakest part of the old model’s security to be breached. After all, millions of innocents still use that old model. And the precedent would affect all Apple customers—and everyone else who uses electronic devices.

“We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them,” the company declared. “But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create… Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.”

Apple calls what the FBI is asking for “a backdoor.”

In part for that reason, media coverage has focused on a preexisting debate about the costs and benefits of unbreakable encryption. Should technology companies build products so secure that not even government agents with lawful warrants can peek inside?

Like many technologists, I believe that the answer is yes: Just as a safe with a weak backdoor is as vulnerable to robbers as to cops with a search warrant, an iPhone with a backdoor is vulnerable to hackers, GCHQ, Chinese and Russian analogs to the NSA, and mischievous 13-year-old computer prodigies. Either a device is secure or it isn’t, and the world’s most powerful country is best served by products that facilitate secure communications for its CEOs, its Senate aides, its citizens, and the dissidents seeking to reform its authoritarian adversaries.

Breaking into this iPhone would certainly require creating software—and setting precedents—that harm America. Weigh against that the uncertain chance that it would yield information about a dead man who already attacked the United States, or about his associates, months after the attack was completed.

That strikes me as a shortsighted tradeoff.

But the FBI’s effort to force Apple’s hand isn’t just about whether the costs of unbreakable encryption outweigh the benefits. (Technically, it isn’t even about “backdoors.”) The most important question raised by this case concerns coercion. The federal government is empowered to compel individuals and corporations to hand over data in their possession upon the presentation of a valid search warrant. Is the FBI also empowered to compel Americans to write and execute malware?

Does it have a claim on the brainpower and creativity of citizens and corporations?

Apple CEO Tim Cook aptly summed up the situation: “The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe,” he declared.

A federal judge is effectively ordering these unnamed people to write code that would indisputably harm their company and that would, in their view, harm America. They are being conscripted to take actions that they believe to be immoral, that would breach the trust of millions, and that could harm countless innocents. They have been ordered to do intellectual labor that violates their consciences.

That may be commonplace in authoritarian countries, but liberal democracies, averse to transgressing core freedoms, ought to refuse to do the same.

The order could set a sweeping precedent if it stands.

“If you allow people to be conscripted in this way, as investigative arms of the government,” Julian Sanchez observes, “not just to turn over data, but to help extract data, where the only connection to a case is that they wrote some software the suspect used or made a device the suspect used, you’re effectively saying that companies are going to have to start a sideline in helping the government with surveillance.”

He adds: “Do we want to accept that courts may compel any software developer, any technology manufacturer, to become a forensic investigator for the government, whether or not the investigation is intrinsically legitimate?”

I do not want to accept that.

Almost every American wants to defeat the menace of terrorism. Apple is certainly invested in that effort and consistently assists the United States government.

Had Syed Rizwan Farook whispered his passcode to me, I’d be the first to alert the FBI. But precisely because support for counterterrorism is so overwhelming, there is a lot to gain and very little to lose from a citizenry that retains the discretion to refuse to cooperate with the government when its requests are overzealous. That is a prudent check to conserve, especially knowing that the counterterrorism mission can corrupt so deeply as to cause U.S. officials to countenance torture, extrajudicial killings of U.S. citizens, and forced rectal feedings of prisoners.

Even if you don’t buy that argument, what the FBI is doing with this order should trouble you. It’s opportunistically using the most unpopular possible target, a dead terrorist, to create other precedents that Nicholas Weaver correctly dubs catastrophic.

As he writes at Lawfare:

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn't yet in law enforcement's hand. So the precedent the FBI seeks doesn't represent just "create and install malcode for this device in Law Enforcement possession" but rather "create and install malcode for this device".

Let us assume that the FBI wins in court and gains this precedent. This does indeed solve the "going dark" problem as now the FBI can go to Apple, Cisco, Microsoft, or Google with a warrant and say "push out an update to this target". Once the target's device starts running the FBI's update then encryption no longer matters, even the much stronger security present in the latest Apple devices. So as long as the FBI identifies the target's devices before arrest there is no problem with any encryption.

But at what cost?

Currently, hacking a target has a substantial cost: it takes effort and resources. This is one reason why I don't worry (much) about the FBI's Network Investigative Technique (NIT) malcode, they can only use it on suitably high value targets. But what happens in a world where "hacking" by law enforcement is as simple as filling out some paperwork?

Almost immediately, the NSA is going to secretly request the same authority through the Foreign Intelligence Surveillance Court using a combination of 702 to justify targeting and the All Writs Act to mandate the necessary assistance. How many honestly believe the FISC wouldn't rule in the NSA's favor after the FBI succeeds in getting the authority?

The NSA's admittedly restrictive definition of "foreign intelligence" target is not actually all that restrictive due to the "diplomatic" catch-all, a now unfortunately public cataloging of targets, and a close association with the GCHQ. So already foreign universities, energy companies, financial firms, computer system vendors, governments, and even high net worth individuals could not trust US technology products as they would be susceptible to malicious updates demanded by the NSA.
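
Weaver’s scenario turns on how software-update trust works: a device verifies only that an update carries the vendor’s signature, not that the update is benign, so a court-ordered payload verifies exactly like a security patch. The sketch below (a toy model with invented key and payload names, using the Python cryptography library’s Ed25519 primitives; it is not any vendor’s actual update pipeline) makes the point concrete:

```python
# Toy model of vendor-signed updates. Requires the third-party `cryptography`
# package (pip install cryptography). All names are invented for illustration.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()   # held by the vendor
device_trusts = vendor_key.public_key()     # baked into every shipped device

def device_installs(payload: bytes, signature: bytes) -> bool:
    """The only check a device can perform: is the update vendor-signed?"""
    try:
        device_trusts.verify(signature, payload)
        return True
    except InvalidSignature:
        return False

security_fix = b"patch: close lock-screen bypass"
court_ordered = b"malcode: disable passcode retry limit"

# The device accepts both, because both carry a valid vendor signature.
print(device_installs(security_fix, vendor_key.sign(security_fix)))    # True
print(device_installs(court_ordered, vendor_key.sign(court_ordered)))  # True
```

The signature proves who signed, never why. That is the whole force of the precedent Weaver describes.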

And the problems don’t end there. In The Washington Post, security technologist Bruce Schneier argues:

Either everyone gets access or no one does.

The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else... The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.