Lessons from Apple vs. the F.B.I.

In Apple’s fight with the F.B.I. over iPhone privacy, the two sides appeared to be defending principles that were both compelling and irreconcilable. Photograph by Xinhua / Wang Lei via Getty

It’s welcome news that the Federal Bureau of Investigation has dropped its legal effort to force Apple to help it create a method of accessing data on a locked iPhone 5C used by Syed Rizwan Farook, one of the perpetrators of the massacre that took place in December in San Bernardino. Not that the Bureau, which ultimately found another means of getting into the phone, didn’t have a legitimate interest in knowing what was on the phone: only an ardent libertarian would argue otherwise. But the case raised a number of important issues and conflicting interests that judges alone can’t be, and shouldn’t be, expected to resolve.

Curiously enough, the F.B.I. and Apple agreed on this point, if nothing else. “That tension should not be resolved by corporations that sell stuff for a living,” James Comey, the director of the Bureau, said in a post published in February at the national-security blog Lawfare. “It also should not be resolved by the FBI, which investigates for a living. It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before.” In explaining Apple’s decision to appeal a court order that was handed down in February, which required the company to help the F.B.I., Tim Cook told Time magazine, “Somebody should pass a law that makes it clear what the boundaries are. This thing shouldn’t be done court by court by court by court.”

Of course, merely calling for a political solution doesn’t help us to decide what one should look like. If there were a simple legal or technological resolution that satisfied the demands of both sides, it would already have been adopted. The reason the San Bernardino case was so contentious was that, at first glance, the two parties appeared to be defending principles that were both compelling and irreconcilable.

Clearly, after Edward Snowden’s revelations about the extent of U.S. government surveillance of citizens, Americans have ample reason to be concerned about the surveillance opportunities offered by digital technology, and the possibility that big tech companies are complicit in this spying. Modern smartphones contain all sorts of personal information, from saved e-mails to financial records to intimate pictures. Apple, as a leading purveyor of smartphones, has every reason to respond to the privacy concerns of its customers. That’s what it did when it incorporated code in iOS that erases the phone’s data when someone enters an incorrect passcode ten times in a row.
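The auto-erase policy described above can be sketched in a few lines. This is a purely illustrative model, not Apple’s actual iOS code: a failure counter that triggers a wipe on the tenth consecutive wrong passcode.

```python
# Illustrative sketch (hypothetical, not Apple's implementation) of the
# auto-erase policy: ten consecutive incorrect passcodes destroy the data.

class Device:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False             # nothing left to unlock
        if guess == self._passcode:
            self._failures = 0       # a correct entry resets the counter
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True        # the tenth failure triggers the wipe
        return False
```

The point of the design, from a privacy standpoint, is that it makes naive guessing self-defeating: an attacker who tries codes at random destroys the very data he is after.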

Law-enforcement agencies, in seeking to protect the public, also have a vital job to do. And they have long had the right to violate people’s personal space, with a court’s approval. For example, in searching for incriminating evidence, they can, given a suitably tailored warrant, break down the front door of a person’s home, rip apart walls and floors, and rifle through personal possessions. They can also make landlords assist them in gaining entry.

In the San Bernardino case, the F.B.I. effectively argued (and Sheri Pym, the federal magistrate who handed down the court order, effectively accepted) that a cell phone isn’t much different from an apartment, and that Apple isn’t much different from a landlord. The company offered a number of legal arguments to the contrary, contending that it shouldn’t be compelled to write new code that would override the security features it had designed into a product. Six weeks of battling it out in court and the media didn’t resolve this central conflict. But it did illuminate some other important aspects of the issues involved in the case.

It now appears as though the F.B.I. seized on the San Bernardino case as an opportunity to pursue a policy agenda that it has had for years, and that it oversold its case. The agency said that it was unable to unlock the iPhone 5C without Apple’s assistance. But as Daniel Kahn Gillmor, a technology fellow at the American Civil Liberties Union, pointed out in a blog post published on March 7th, this claim didn’t ring entirely true. In his piece, which included pictures of an iPhone 5C’s circuit board, Gillmor described how investigators could work around the auto-erase feature by removing the device’s NAND flash memory and backing it up, then trying every conceivable four-digit passcode combination. “If the FBI doesn’t have the equipment or expertise to do this, they can hire any one of dozens of data recovery firms that specialize in information extraction from digital devices,” he wrote. It’s not known for certain if the F.B.I. used the method that Gillmor recommended to get into Farook’s phone. But the post suggested that the Bureau hadn’t exhausted all of the technological possibilities for accessing the data. This may damage its credibility if it gets into a similar legal dispute in the future.
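Gillmor’s workaround rests on a simple observation: the failure counter lives in the phone’s flash memory, so an investigator who images the NAND chip once can restore that image whenever the auto-erase fires and keep guessing. A minimal simulation, under stated assumptions (the “flash” here is just a dictionary holding the counter; this is not the F.B.I.’s actual procedure), shows why mirroring defeats the ten-attempt limit:

```python
# Illustrative sketch of NAND mirroring (hypothetical, not the actual
# forensic procedure): back up the device state once, and restore the
# backup each time the auto-erase fires, so that all 10,000 four-digit
# passcodes can eventually be tried.

import copy

MAX_ATTEMPTS = 10

def try_passcode(flash, secret, guess):
    """One unlock attempt against the simulated device state."""
    if flash["wiped"]:
        return False
    if guess == secret:
        return True
    flash["failures"] += 1
    if flash["failures"] >= MAX_ATTEMPTS:
        flash["wiped"] = True        # auto-erase would fire here
    return False

def brute_force_with_mirroring(secret):
    flash = {"failures": 0, "wiped": False}
    backup = copy.deepcopy(flash)            # image the NAND once
    for code in range(10_000):               # every four-digit passcode
        guess = f"{code:04d}"
        if try_passcode(flash, secret, guess):
            return guess
        if flash["wiped"]:
            flash = copy.deepcopy(backup)    # restore the image, keep going
    return None
```

At worst the attacker restores the image 999 times, once per ten failures, which is tedious but entirely mechanical; this is why a four-digit passcode offers little protection once the erase counter can be reset.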

There is also reason to question an argument that Comey has been making in conjunction with the case—that strong encryption protocols, which other technology firms are also deploying, are producing a new “dark” zone that terrorists, criminals, and other bad actors can exploit. Undoubtedly, the encryption measures introduced by Apple and other tech firms since the Snowden revelations have made it easier for people to conceal data in locked iPhones, encrypted WhatsApp messages, and other protected spaces. But the authorities still have the capacity to collect enormous amounts of information. In the San Bernardino case, for example, the investigators obtained records from Farook’s employer’s cellular provider, which would have included details of all of the calls he placed on the device, and perhaps his saved messages. Cook told Time that Apple itself gave the F.B.I. “a cloud backup on the phone, and some other metadata.” Law-enforcement officials have said that they wanted to look at Farook’s list of contacts and any other remaining data. Apparently, they were concerned that some recent data might have been missing—it emerged a few weeks ago that Farook may have changed his password, turning off automated iCloud backups in the process.

Apple, and the companies and organizations that submitted amicus briefs in support of Apple’s position, argued that it was impractical and risky to try to create a pass-through on a one-off basis. This sounds like a strong argument, but it needs to be explored further. Were Apple and its allies saying that they can’t be trusted to keep their own security protocols safe? Or were they arguing that it is impossible to design an encryption protocol that can be breached by its creator, but no one else? In pledging to fight the court order, Apple used the first argument, saying, “The only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.” Earlier this month, in an open letter to President Obama, the Electronic Frontier Foundation, which is supporting Apple, appeared to be invoking the second argument. “You can’t build a backdoor into our digital devices that only good guys can use. Just like you can’t put a key under a doormat that only the FBI will ever find,” the letter read.

Some experts found Apple’s position that it was acting in order to protect privacy rights to be less than convincing. In a post at Lawfare, Susan Hennessey and Benjamin Wittes, two scholars at the Brookings Institution, described the company’s self-presentation as “largely self-congratulatory nonsense.” Hitherto, Hennessey and Wittes noted, Apple had strongly opposed legislation that might have clarified laws related to encryption. In now arguing that the existing law couldn’t compel it to help the government, the firm was adopting a “near-duplicitous posture” and “trying to carve out a zone of impunity for itself that rightly alarms the government and should alarm the very citizens the company (which calls these citizens ‘customers’) purports to represent.”

With the case dropped, what will happen now? One option would be for the President and Congress to take up a suggestion Apple has made to “form a commission or other panel of experts on intelligence, technology, and civil liberties to discuss the implications for law enforcement, national security, privacy, and personal freedoms.” Ordinarily, there are good reasons to be skeptical of commissions, which are sometimes used to placate the public while, in fact, serving to delay necessary action and preserve the status quo. In this case, though, a public airing of the issues, some of which are technical and complex, could be productive, especially if the commission’s remit was extended to include other companies and their products, and the broader issue of privacy in the electronic age.

Ever since the early nineteen-nineties, when the Internet was just being widely adopted, the F.B.I. and the National Security Agency have been arguing that the communications world is “going dark” and depriving them of access to information they need to safeguard the public. The revelations from Snowden and others demonstrated that, in reality, we live in what Peter Swire, a professor of law and ethics at the Georgia Institute of Technology, has called “a golden age of surveillance.” In a recent report published by Harvard’s Berkman Center for Internet and Society, a team of experts pointed out that some powerful trends will continue to “facilitate government access” to personal information. The business models of firms like Facebook and Google depend on their ability to track user data. New cloud services create yet more unencrypted data. And the Internet of Things, which will deploy countless devices, in all sorts of places, “promises a new frontier for networking objects, machines, and environments in ways that we are just beginning to understand.”

Even in such a data-rich environment, however, the rise of strong encryption is having an impact and creating some hidden areas. There will certainly be instances when legal authorities want access to encrypted information that they can’t get at. Terrorism investigations aren’t the only example. Absent methods of accessing systems protected by strong encryption, Obama asked a few weeks ago, “What mechanisms do we have to even do things like tax enforcement? If you can’t crack that at all, if government can’t get in, then everyone’s walking around with a Swiss bank account in their pocket, right?”

At this stage, that specific threat may not be too grave. Tax authorities have sweeping powers to demand access to bank accounts and other financial records. But as encrypted blockchain technologies develop, and perhaps start to replace regular money, they could create more opportunities for concealment. Regardless, Obama was surely right when he said that the time to confront these issues is now. If we wait until after the next big terrorist attack, we could end up with a second Patriot Act.