Losing Liberty Through the Backdoor
Letting the government bypass iPhone security measures won’t stop terrorists—or make you safer.
The principles at stake in balancing government surveillance with individual privacy are simple enough to state. On one side, the government believes that the investigation of someone who is planning or has carried out a crime should be unconditional: all evidence potentially relating to the event should be accessible to law enforcement. On the other side, citizens have a reasonable expectation of privacy in their day-to-day activities, meaning that the government should have to demonstrate indisputable “probable cause” to a judge before undertaking any intrusion into an individual’s private space. And even then, the intrusion should be narrowly defined to cover only the actual criminal activity under investigation.
The problem comes where the two principles collide, particularly as the new national security relationship between government and governed is still being laboriously defined in the wake of Edward Snowden’s revelations about the extent of American and British communications monitoring. The manufacturers of phones and the providers of internet and phone services, which inhabit an uncomfortable space between the government and the public, have inevitably become the new zone of conflict. Apple, maker of the world’s best-known smartphone, has recently found itself in the crosshairs.
Companies such as Apple market hardware and communications services globally on the presumption that their systems are secure, meaning resistant to being hacked or accessed by either criminals or the government. As a result, they have incorporated security features that are, at least in theory, unbreakable, among them “end-to-end encryption,” in which only the sender and the receiver can read a message’s contents. Sophisticated encryption schemes reportedly allow so many possible combinations that they can be broken only by a computer capable of running through thousands or even millions of them. Such computers exist at NSA, but they cannot defeat a second feature some phones have: an auto-erase function that wipes the phone’s data after ten failed attempts to enter the passcode.
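To make the interplay between brute force and the auto-erase feature concrete, here is a minimal illustrative sketch in Python of a try-limited passcode check. It is emphatically not Apple’s implementation: the passcode, the wipe() stand-in, and the MAX_ATTEMPTS constant are hypothetical, modeled only on the behavior described above.

```python
# Minimal illustrative sketch of a try-limited passcode check. This is not
# Apple's firmware; PASSCODE, wipe(), and MAX_ATTEMPTS are hypothetical
# stand-ins modeled on the auto-erase behavior described in the article.
import secrets

MAX_ATTEMPTS = 10      # device erases its data after ten failed tries
PASSCODE = "743912"    # a six-digit code: one of 10**6 possibilities

failed_attempts = 0
device_wiped = False

def wipe() -> None:
    """Stand-in for discarding the encryption key, leaving the data unreadable."""
    global device_wiped
    device_wiped = True

def try_passcode(guess: str) -> bool:
    """Return True on a correct guess; wipe the device after too many failures."""
    global failed_attempts
    if device_wiped:
        return False
    if secrets.compare_digest(guess, PASSCODE):  # constant-time comparison
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= MAX_ATTEMPTS:
        wipe()
    return False

# A brute-force attacker gets at most 10 of the 1,000,000 possible guesses
# before the data is gone: a 0.001 percent chance of success.
for n in range(1_000_000):
    if try_passcode(f"{n:06d}"):
        print("cracked:", f"{n:06d}")
        break
    if device_wiped:
        print(f"wiped after {MAX_ATTEMPTS} failed attempts")
        break
```

Against the full six-digit keyspace, a guesser exhausts the ten tries almost immediately, which is precisely why the FBI’s demand centers on having the auto-erase limit disabled.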
The national security community, for its part, maintains that any communications system must have a “backdoor,” that is, a point through which access can be obtained that bypasses or disables the security or passcode and reveals the contents. To further complicate the issue, a federal judge has now entered the conversation, ordering Apple to “unlock” the iPhone used by Syed Rizwan Farook, one of the two terrorists who killed 14 people in San Bernardino, California, on December 2. The FBI reportedly has been trying to access the phone for over two months without success and claims that Apple has been uncooperative in revealing the technology involved.
Apple’s chief executive Tim Cook responded to the demand with a refusal, saying that the “U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create.” Cook went on to state that any attempt to create a backdoor would weaken the overall integrity of the security system, making it susceptible to hacking and other cyber intrusions. He also correctly noted that many people store substantial information on their phones, meaning that the government or a criminal would gain access to personal data far beyond the record of who has been calling whom and when.
All the phone’s peripheral information would become vulnerable, and there is no way to guarantee that the government would not access information that has nothing to do with its investigation. Indeed, the past 15 years suggest that the government cannot be trusted whenever it is presented with an opportunity to overreach. And once a key is developed to compromise the security of even one phone, it can be used on every phone running the same operating system, which means any of the millions of iPhones in circulation.
Cook did not note his other concern: creating a backdoor for the U.S. government would cost Apple much of its huge overseas market, as consumers there turned to other phones still promising unbreakable encryption. It would be devastating for the company.
The Apple chief also expressed another concern, that bowing to the FBI demand would inevitably lead to new administratively imposed requirements, under which the “government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
Apple quite likely also understands that any key provided to the FBI would not simply be destroyed after use. It would be shared with the CIA and NSA, and likely with foreign partners like the British and the Israelis, who would not be reluctant to use the new toy. Cook is certainly aware of legislation pending in Great Britain. The British security services, like the FBI, are particularly concerned about phone encryption, and a proposed law would require the companies involved to “decrypt” desired information when presented with a warrant to do so. It is unclear what would happen if a company cannot comply because no such technical option exists, as Apple argues, but the British government might well demand that such a feature be built into new operating software. If the company failed to comply, it could face punitive fines or even have its business operations in the UK shut down.
There is also considerable debate over proposed British government monitoring of the internet, likely a harbinger of what is coming worldwide. Legislation proposed by Home Secretary Theresa May, part of a package of new laws designed to give the police and security services freer access to a whole range of communications services, would require technology companies to retain all “internet connection records” for 12 months.
That means that any time you send or receive something or visit a website, the information would be saved and accessible to the police. And it would all take place without either judicial oversight or any requirement for a warrant. Interestingly, the legislation is being promoted as a tool to investigate child-pornography sites, but it would no doubt be used far more extensively if it becomes law. As with the frequently abused FBI surveillance conducted through National Security Letters, the target of an investigation would have no knowledge that he was being looked at, and the communications provider would be forbidden by law from revealing to the customer that anything was taking place.
Nearly everyone would likely agree that revealing the contents of terrorist Syed Farook’s phone would be desirable. But if doing so would also make all Apple smartphones vulnerable to government intrusion, it would be the devil’s own bargain—trading away a fundamental liberty for a tool that the security services would undoubtedly find helpful, though it is unlikely to be a game changer. And as soon as militants learn that some of their phones are vulnerable, they would undoubtedly find other ways to communicate, as they have done in the past.
Government inevitably pushes for more power, defining the rules it operates under so as to sanction behavior that would once have been considered unacceptable. In this case, it is important to understand that the Farook iPhone is not just a single phone owned by a terrorist, as the FBI would have one believe. It is instead a wedge issue, representing the government’s insistence that everyone’s zone of privacy should be defined by some bureaucrat’s interpretation of “making you safe.” Developing an iPhone backdoor to find out whom Farook talked to would be a very bad bargain.
Philip Giraldi, a former CIA officer, is executive director of the Council for the National Interest.