Apple Encryption and the Erosion of Privacy

In the wake of the terror attack in San Bernardino late last year, the FBI and other government agents set out to uncover as much information about the attackers as possible. After searching the suspects’ home, speaking with their families, and following other leads, authorities uncovered a great deal of information about the two shooters.

Despite these findings, however, gathering all the desired data proved exceptionally difficult. In February 2016, the FBI announced that it was unable to unlock the iPhone used by one of the shooters. The phone permits only a limited number of passcode attempts before blocking further tries, and its operating system encrypts the data stored on the device.
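To make that mechanism concrete, here is a minimal sketch in Swift of how a passcode retry limit with escalating delays and an optional wipe might be structured. It is purely illustrative: the type name, the ten-attempt limit, and the delay schedule are assumptions for exposition, not Apple’s actual implementation.

    import Foundation

    // Illustrative sketch only -- not Apple's actual code.
    struct PasscodeGuard {
        private(set) var failedAttempts = 0
        let maxAttempts = 10              // hypothetical limit
        let wipeAfterMaxAttempts = true   // mirrors an optional "erase data" setting

        // Records a failed passcode entry and returns the delay (in seconds)
        // to impose before the next attempt, or nil if the device should be wiped.
        mutating func recordFailure() -> TimeInterval? {
            failedAttempts += 1
            if failedAttempts >= maxAttempts && wipeAfterMaxAttempts {
                return nil                // caller would discard the encryption keys
            }
            switch failedAttempts {       // escalating delays between attempts
            case ..<5:  return 0
            case 5:     return 60
            case 6:     return 5 * 60
            default:    return 60 * 60
            }
        }

        // A correct entry resets the counter.
        mutating func recordSuccess() { failedAttempts = 0 }
    }

In effect, the FBI’s request was for a version of the software with the wipe and the delays removed, so that passcodes could be tried rapidly.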

As a result, the FBI requested that Apple Inc. create a “backdoor” to its own operating system, one that would allow the FBI and other law enforcement agencies to undermine the security features of Apple’s system. Apple denied this request, prompting the FBI to seek a court order.

Apple has opposed the FBI’s demand that it create software allowing law enforcement to access information stored on its devices. The legal battle is ongoing. On February 16, Apple CEO Tim Cook released an online statement to Apple customers explaining why the company was refusing to comply with the order.

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

Public opinion is mixed on the case. A recent poll by Reuters found that 55 percent of Americans side with Apple, nearly 30 percent side with the FBI, and the rest are unsure.

Those who side with the FBI argue that the government should have access to personal information stored on phones and other electronic devices in order to properly investigate crimes. The fact that Apple is “protecting a dead terrorist” by refusing to unlock one of its devices is seen by some as helping enemies of the United States.

In response to privacy concerns, officials have stated that Apple can write the code for the FBI, the agency will use it in this one case, and Apple can then destroy the software if it so chooses. Few people, rightly, are confident that any such software would be used only once. Critics argue that giving the FBI and other government agencies a backdoor opens a Pandora’s box of privacy issues and creates an obvious target for hackers of all stripes.

This controversy surrounding Apple is the latest chapter in the debate over government surveillance and the right to privacy. My coauthor Chris Coyne and I have written extensively about the origins of domestic surveillance. In particular, we’ve examined how surveillance tools initially developed for use solely against foreign enemies were eventually brought home and used en masse against U.S. citizens. There are countless examples of this, going all the way back to the U.S. war in the Philippines in 1898! Practices like wiretapping phones and telegraph lines, once considered a gross violation of privacy, became standard operating procedure in all kinds of criminal investigations, including ordinary domestic crimes.

The idea that the government would use this software only once is patently absurd, as is the idea that it would never be used against U.S. citizens in other cases. It’s not difficult to see how this could proliferate and, as my coauthor and I have noted, there are multiple instances of surveillance techniques expanding domestically. Today, the government is asking for access because of a case involving terrorism, but there are plenty of other heinous crimes as well. Moreover, the push for Apple to unlock iPhones has already expanded far beyond the San Bernardino case. To give but one example, the Manhattan district attorney’s office has asked Apple to break into 175 phones it cannot currently access.

This leads to a natural series of questions. If Apple can be compelled to provide a backdoor in this case, in what other cases can the same argument be used to force Apple to undermine its own systems? What criteria are being used? Should Apple be forced to circumvent its own security features for murder investigations? Assault? Stalking? What about tax evasion? Simple drug possession charges? Prostitution?

In the aftermath of terrorist attacks, it’s a natural reaction to want to uncover as much information as possible. We should be extremely cautious, however, before advocating measures that are fundamentally opposed to individual liberty. Otherwise, those measures, meant to be used only in “special cases,” become standard operating procedure.

Abigail R. Hall is a Research Fellow at the Independent Institute and an Assistant Professor of Economics at the University of Tampa.