7 Reasons a Government Backdoor to the iPhone Would Be Catastrophic

You’ve likely caught wind of the fact that the government and Apple are in the midst of an intense legal showdown in what Edward Snowden has called “the most important tech case in a decade.” The battle is over the legality of a court order compelling Apple to write new software — which the company cleverly referred to as GovOS in a court filing today — that disables several security features that the FBI claims are preventing it from accessing the contents of the work phone of one of the shooters in the San Bernardino attack. Apple is resisting the order, and the company’s CEO, Tim Cook, has committed to going all the way up to the Supreme Court if necessary.

Lest there be any doubt, the ACLU is with Apple on this one, as it was in a similar case several months back. The government’s request is not just about this one iPhone — it has far-reaching consequences for every device, for global cybersecurity, and for basic freedoms at home and around the world. Communications security is critical for the functioning of democracy, and the precedent the government is seeking could do terrible and lasting damage.

Here’s why.

1. The precedent would undermine some of the most important developments in digital security over the last few decades.

To bypass the iOS security features that are preventing the FBI from accessing the contents of the phone, Apple would need to cryptographically “sign” a new version of iOS before pushing it out to the phone (in the same manner it does whenever iPhone users update the iOS on their phones). The signing step essentially confirms that Apple vouches for the update.
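The trust model described above can be sketched in a few lines. This is a toy illustration only: real iOS updates are verified with Apple's asymmetric (public-key) signatures, while this sketch uses a shared-secret HMAC as a stand-in, and the key name and functions are hypothetical.

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's signing key. Real code signing
# uses an asymmetric key pair: the vendor signs with a private key and
# the device verifies with the corresponding public key baked into it.
VENDOR_SIGNING_KEY = b"hypothetical-vendor-key"

def sign_update(update_blob: bytes) -> bytes:
    """Vendor side: produce a signature vouching for the update."""
    return hmac.new(VENDOR_SIGNING_KEY, update_blob, hashlib.sha256).digest()

def install_update(update_blob: bytes, signature: bytes) -> bool:
    """Device side: install only if the signature checks out."""
    expected = hmac.new(VENDOR_SIGNING_KEY, update_blob, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False  # reject unsigned or tampered updates
    # ... installation would proceed here ...
    return True

update = b"security-patch-v1"
sig = sign_update(update)
assert install_update(update, sig)          # vendor-signed update installs
assert not install_update(b"malware", sig)  # anything else is rejected
```

The point of the sketch: the device refuses any update the vendor has not signed, which is exactly why the government needs Apple's cooperation, and why a signature on known malware would poison the whole mechanism.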

If Apple is forced to sign the new, security-broken GovOS, it would undermine one of the most important developments in digital security in recent years. These days, all tech companies build automatic updates into their products. This is an excellent way to ensure that security flaws are patched as quickly as companies discover them and that all of us continue to use devices protected against malicious attackers.

But once the government secures a precedent to force a company to vouch for an update that it knows is actually insecure malware, users will stop trusting automatic updates. After all, how would anyone be able to trust an update from Apple when the public knows that the government might be directing the insertion of vulnerabilities into new software, even when it’s signed by Apple? Vulnerabilities will go unfixed, creating an optimal environment for hackers and spies. At a time when even President Obama has recognized cybersecurity to be one of the most significant economic and national security threats we face today, it makes no sense to undermine one of the best online security mechanisms out there.

2. Foreign governments and cybercriminals would rejoice.

The malware that the government wants Apple to write would certainly be used as a mold to break into other iPhones — indeed, law enforcement is lining up in case Apple loses this case. A government-mandated master key to a locked smartphone would be like candy to foreign governments who want to monitor their citizens, and tech companies — who can currently resist such requests by arguing that they simply do not possess the software required to help — won’t be able to refuse to comply with demands abroad if the U.S. government gets its way in this case. (Keep in mind, also, that most other countries lack the procedural and substantive protections against searches and seizures that our Constitution guarantees.)

Indeed, a government win in this case would almost surely have a domino effect leading to thousands of such requests — not just to Apple, but to all consumer tech companies. If every tech company needs to be ready to write new backdoors into its products, that means the introduction of scores of new vulnerabilities into the world. And the more such backdoors exist, the more malicious actors will focus their efforts on seizing them.

3. The human rights implications are chilling.

Again, it’s not hard to imagine the Chinese government serving Apple with a warrant to hack into the phone of a dissident activist or intellectual. That development would have a devastating impact on democracy and human rights activists and movements worldwide, which depend on secure communications to flourish. Recognizing the importance of encryption to human rights, the U.S. government has spent tens of millions of dollars to equip activists around the world with technologies to allow them to communicate securely. This case could undercut those efforts in one bang of the gavel.

4. With the Internet of Things, the government wouldn’t need your smartphone to spy on you.

If the FBI wins the struggle against Apple, the implications would extend far beyond your phone. The precedent would allow the government to demand backdoor access to any device it thinks might assist it in an investigation. With the proliferation of smart devices that are constantly connected to the Internet, all those warnings about the end of privacy that may have once sounded hyperbolic will have proved prescient. That smart TV, wireless shower speaker, or intelligent oven could be compromised by a manufacturer compelled by the government to monitor you at home.

5. Putting this powerful tool into the hands of law enforcement agencies that have a history of biased policing will compound existing disparities.

We know that there are existing disparities in policing and warrant execution practices. Increased government investigative powers will simply reflect — and likely exacerbate — these disparities. In other words, already overpoliced communities are likely to be the recipients of these new-age search warrants, which grant the government troubling access to our digital data.

This is particularly concerning because the government has taken the position that once it has access to your phone, it has the authority to look at everything. So an investigation into a minor drug crime could result in police sifting through emails, text messages, and everything else stored on the device.

6. In a democracy, companies are not conscripted to work for the government against their will.

Forcing a private company to become an investigative agent for the government is an extreme proposition that wouldn’t stop with this one phone. If the government gets its way in the Apple case, it will get a green light to compel tech companies to work on its behalf whenever it wants help coding its way through the defenses of a given device. There’s a big difference between compelling a company to hand over information already in its possession and compelling a company to serve as a spy for the government. If the government prevails in the Apple case, it would make for an unprecedented expansion of government overreach — not just into our data, but into our creative agency.

7. Encryption has been used to secure communications for centuries.

Some of America’s highest-level security officials, including former NSA Director Michael Hayden and former DHS Secretary Michael Chertoff, have extolled the virtue of encryption in securing our cyberdefenses. They are part of a rich history. As our friends over at EFF have noted, the Founding Fathers of the United States were big fans of encryption, which they recognized was critical to prevent their communications from falling into the wrong hands. Not only did the Founding Fathers use encryption, but they actually developed encryption tools after American independence — to protect their communications from the government they helped to start.

The bottom line is that for the sake of privacy, data security, and democracy, we should be focused on strengthening our digital defenses, not weakening them. That’s far more important than the data on any one phone.

Comments (33)


I’m just taking up the legal aspects here, not what I think would be right in a perfect world. Law enforcement agencies in the US have the right to go to a judge and get approval (a warrant/court order) to break into your house or your car or your storage cubicle or your office or whatever to look for evidence. They can even hold you until you poop if they think you've swallowed something they want. So what's the deal with mobile phones? Your house and your poop are fair game but your cell phone is not? Privacy? With a warrant the cops can tap your landline and bug every room in your house, your car, your favorite table at your favorite bar, your lapel pin, anything they want. Why are cell phones not subject to this?

Scenario: What if a company started making houses that were completely break-in proof? There’s absolutely no way to get in short of nuking it, and if you try to nuke it the house itself destroys everything inside. These houses become very popular with meth cookers and kidnappers and folks who like to keep bodies in their basements and, also, normal folks who like their privacy, and millions of people start buying them. So along comes the FBI or some such saying they have a legal warrant to get into a particular house, but they can’t. There’s no way in and the house will self-destruct if they try too hard. They ask the manufacturer of the house to let them in, and the manufacturer says it can’t (and wouldn’t if it could). The cops are going to have a pretty good argument to be let in, I think, or to make it against the law to build such a house in the first place. Why is a cell phone different?


The aspect of your argument that you're leaving out is that the owner of the phone is no longer alive. If he were, then he could possibly be compelled by a warrant to unlock his phone or face further consequences. To me the argument is solid: unlocking one phone has the potential to allow all phones to be unlocked.
There are law enforcement agencies around the country with storage rooms full of phones, salivating at the idea of being able to go through them.
It's simply a bad idea for the average person.


Weak argument. You are discussing warrants at this point. In your scenario, the authorities "can get a warrant" to do all the things you list. If judges could/can sign warrants as easily as you suggest, with no basis...there's your problem. That's ok to you? A cell phone IS no different. I don't want anyone in my phone OR wiretapping me with no cause.


Since the so-called War on Drugs, the Judicial Branch has been the real weak link here. Instead of requiring a constitutional amendment, which is a bad idea, the U.S. Supreme Court since the late 1960s has perverted and distorted the "letter & spirit" of the 4th Amendment (e.g., Terry v. Ohio, 1968).

Providing judicial review is the top duty of the U.S. Supreme Court and lower courts. The 4th Amendment's letter & spirit is actually one of the most clearly worded in the U.S. Constitution. It was meant to be chronological: an actual crime happens first; then, if evidence provides probable cause to search a particular person or place, the police officer/agent testifies under risk of perjury to obtain a judicial warrant from a magistrate judge. The 4th Amendment outlaws "preemption policies," including the Bush Doctrine after 9/11, which place the search before an actual crime happens.


It's not. But the government does NOT have the legal authority to, for instance, make YOU go collect someone else's poop so they can review it.

Bottom line: If the FBI wants into the phone, OK. They need to figure out how to legally do it, just like they are supposed to legally collect other evidence. It's not Apple's fault that the FBI doesn't know how to get what they need legally, just like it's not American Standard's fault the government couldn't figure out how to stop the poop they wanted from going down the sewer line.


So the government should make iOS illegal? I think a lot of iPhone users wouldn't like that. And yes, that's what they'd have to do. It's either legal or it's not. You surely don't want the government to have the power to make anything legal or illegal for each individual, do you? Do you want the law to say your neighbors are legally allowed to sell drugs because they needed the money 'real bad'?


I completely agree with everything you wrote here. However, you should acknowledge the strong arguments on the other side, including the analogy of this phone to a safe deposit box. A court can order the opening of a safe deposit box, even if a bank lacks the customer key. Brute force is used to open it. A bank will likely have to unlock a gate to give access to it. A bank has all the same reasons that Apple does to claim security interests and customer expectations of privacy in their boxes. The bank, or some technician, will have to break into something that the bank has promoted and gone to great lengths to ensure is secured, even from itself. The court in this case could very well say that if Apple can break into it, it must. The bright line drawn by the court may be the difference between what Apple can break into and what it cannot. In other words, only zero-knowledge encryption gives Apple, or any other manufacturer or provider, a pass from having to comply with an order.
Apple's argument that they are as attenuated to the perpetrators as Chevrolet is to someone driving their car is very weak. Even private parties can subpoena black box data that may be proprietary and require a special Chevrolet tool to extract.

Personally, I believe CALEA should be amended to make clear that Apple, and other manufacturers, are entitled to the same protections from government - forced modification or design. Then we don't have to worry about the AWA.


If the contents of the safe deposit box are coded can the police ask the bank to solve the code and decrypt the communication?


The "safe deposit box" analogy would actually amount to giving the government a master key to unlock every safe deposit box in the room. If a government locksmith opens one box by brute force, the box is then repaired with a new lock, restoring its security. That's not what the FBI wants to do.

Freedom has a price which is clearly written in the 4th Amendment. Requiring probable cause of a past crime, where the officer/agent risks penalty of perjury in a warrant application, ensures we have an "innocent until proven guilty" system where the burden of proof is on the "accuser" - not the accused.

Many state governments, especially in the southern portion of the United States, have operated a "guilty until proven innocent" justice system where the suspect must prove they are not guilty.

Foreign nations that do this are called "banana republics".

Yáder C.

You could be right about this, but I don't trust Apple, and nobody should. That company didn't have a problem loading undesired music onto our devices, and that's enough to think they only pursue users' money. If Apple doesn't allow government agencies to break into iPhones, it's just because they don't see a monetary benefit from customers. Forget about fake idealism, this is just about money.
