The Real Stakes of Apple’s Fight With the FBI

On Tuesday, the government obtained a court order compelling Apple to hack into an iPhone as part of the FBI’s investigation into the San Bernardino shooters. While the government’s investigation is an important one, the legal order it has obtained crosses a dangerous line: It conscripts Apple into government service and forces it to design and build what is, in effect, a master key that could be used as a mold to weaken the security of an untold number of iPhones.

The resulting order is not only unconstitutional, but risks setting a precedent that would fundamentally undermine the security of all devices, not just the one iPhone being debated in the news.

A bit of background is necessary to understand this debate.

As part of its investigation, the FBI has apparently obtained an iPhone 5C used by one of the shooters. The bureau has said that the phone is encrypted and protected by a passcode, and that it needs Apple’s assistance to unlock the phone. Specifically, it has asked Apple to design and write custom software that would disable several security features on the phone.

While Apple has generally cooperated in the investigation, it has refused the FBI’s latest demand to write malware that would help the FBI hack the device. To its credit, Apple has poured incredible resources into securing its mobile devices. One consequence of that effort is that Apple does not have a ready way of breaking into its customers’ devices. In the words of Apple’s CEO, Tim Cook: “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

But the FBI is dismissive of that effort. According to its legal filing, the FBI believes that Apple could, if compelled, build a master key that would allow the FBI to try to break into iPhones like the one involved in the San Bernardino investigation. The FBI acknowledges that this would require Apple to write new software and then cryptographically “sign” that software (as the iPhone will accept only software updates signed by Apple).
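The signing requirement is the crux: the phone runs only firmware that carries a valid Apple signature, so the FBI cannot simply install its own hacked software. A minimal sketch of that gatekeeping logic is below. Real iPhones verify asymmetric (public-key) signatures made with a private key only Apple holds; since Python's standard library has no public-key signing, this sketch uses an HMAC as a simplified symmetric stand-in, and the key and firmware strings are hypothetical.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's private signing key. On a real
# iPhone, verification uses asymmetric signatures; HMAC is used here
# only so the sketch runs on the standard library alone.
APPLE_SIGNING_KEY = b"held-only-by-apple"

def sign_update(firmware: bytes) -> bytes:
    """What only the key holder (Apple) can do: produce a valid signature."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def phone_accepts(firmware: bytes, signature: bytes) -> bool:
    """What the phone does at install time: verify before running anything."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# An update signed by the key holder is accepted.
official = b"official iOS update"
assert phone_accepts(official, sign_update(official))

# Tampered or unsigned firmware is rejected -- which is why the order
# compels Apple itself to sign the weakened software.
tampered = b"modified firmware"
assert not phone_accepts(tampered, sign_update(official))
```

The design point the article turns on: possession of the device is not enough; whoever controls the signing key controls what the device will run.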

A federal magistrate judge granted the FBI’s request the same day but gave Apple five days to object. Again to its credit, Apple has vowed to fight.

It is critically important that Apple win—for cybersecurity and for the fate of privacy in the digital age—for several reasons.

First, the government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.

The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.

Second, this debate is not about one phone—it’s about every phone. And it’s about every device manufactured by a U.S. company. If the government gets its way, then every device—your mobile phone, tablet or laptop—will carry with it an implicit warning from its manufacturer: “Sorry, but we might be forced to hack you.”

Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple’s Cook points out, backdoors are uniquely dangerous: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”

That risk is only growing every day as the “Internet of Things” expands. For the government, every device connected to the Internet will be more than just a novel convenience—it will be a new window into your home. The fridge that responds to your verbal commands might have a backdoor to allow for remote listening. The TV that allows you to video chat with your family might be commandeered into a ready-made spy camera.

These are the real stakes of the debate: Either American companies are allowed to offer secure products to their consumers, or the U.S. government is allowed to force those companies to break the security of their products, opening the door for malicious hackers and foreign intelligence agencies alike. For the sake of both our privacy and our security, the choice is clear.

This post was originally published by Time.



Your article didn't come out and say what the constitutional issues are, but I see "involuntary servitude" in violation of the 13th Amendment. I hope that issue is raised, so I can see what the government's sales pitch is.

Steve B

If one judge can order Apple to create back door software to break into personal iPhones, then any judge can order the same thing (as we are already beginning to see with Cyrus Vance in NY). What's to stop courts in other countries where Apple does business from doing the same? There is no way to keep this software contained once it exists. I am against forcing Apple to create this software.


This entire issue is total Bullshit. No criminal act should ever be an ACLU issue.


You really do not understand the ACLU. And, from your comment, little else!


Wewe in your umbaya, old man.


The technical argument that Alex is trying to make has been explained in more detail by Bruce Schneier and Dan Guido.

The argument that the FBI and the DoJ are really after establishing a legal precedent, in the absence of the legislation James Comey has been pushing for, has also been clearly made by Julian Sanchez.

The technical aspect is perhaps more controversial and convoluted than was apparent, but Schneier is really dead clear about what it really boils down to:

"There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world. Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked software as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.

And while this sort of attack might be limited to state actors today, remember that attacks always get easier. Technology broadly spreads capabilities, and what was hard yesterday becomes easy tomorrow. Today’s top-secret NSA programs become tomorrow’s PhD theses and the next day’s hacker tools. Soon this flaw will be exploitable by cybercriminals to steal your financial data. Everyone with an iPhone is at risk, regardless of what the FBI demands Apple do."

Yesterday cryptographer Micah Lee was already advising iPhone users to use an extra-long 11-digit passcode, but how effective that could really be in light of what Schneier and Guido have added, and the fact that it could all come down to a firmware update, should be clarified before we get too comfy and decide to apply the trust mode again.
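The arithmetic behind Lee's advice is easy to check. A rough sketch, assuming the ~80 ms per passcode guess that Apple's iOS security documentation says the key-derivation hardware enforces; note this delay only holds as long as the firmware enforcing it stays in place, which is exactly Schneier's worry about a forced update.

```python
# Back-of-the-envelope brute-force estimate for random numeric passcodes,
# assuming ~80 ms per guess (hardware-enforced key-derivation delay cited
# in Apple's iOS security documentation).
SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def average_crack_years(digits: int) -> float:
    """Expected time to brute-force a random numeric passcode of this length.

    On average an attacker tries half the keyspace before hitting the code.
    """
    combinations = 10 ** digits
    avg_seconds = (combinations / 2) * SECONDS_PER_GUESS
    return avg_seconds / SECONDS_PER_YEAR

for n in (4, 6, 11):
    print(f"{n}-digit passcode: ~{average_crack_years(n):.4g} years on average")
```

A 4-digit code falls in minutes and a 6-digit code in hours, but an 11-digit code pushes the average past a century — provided, again, that the per-guess delay survives whatever firmware the phone is persuaded to accept.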


All of you objecting to the actions of the ACLU must read and understand the Constitution of the United States of America.


The Constitution gives the government the right to search after a warrant is obtained. Apple could make a software update disabling the feature for 30 days. It wouldn't be a master key any more than any update they have created. It could be kept where Apple keeps its other software and installed on any phone after the government presents a valid warrant. IMHO, Apple is making too big a deal over this. If they had quietly done this no one would be the wiser and this would be over.

Ted Voth Jr

Reading the commentaries, I'm amazed at what craven toadies the descendants of the citizens of 1776 have become: we'd better change that line 'the land of the free and the home of the brave' to 'not the land of the free but the home of the slave'.
The article says 'there’s no way to ensure that [the technology] is only used by our government, as opposed to repressive regimes,' seemingly unaware that our government has itself become the repressive regime it speaks of. It's supremely ironic that the repressive regime the extreme right fears now exists in the very hands of their very own whacked-out corporate lackey politicians.


We have to start using some common sense. Right now there are many criminals, including terrorists, using these smart phones that are encrypted. These people are using these phones to talk to their fellow conspirators to do great harm to us. We have to find the right balance between privacy and security. Having a device that criminals can use against us without being able to access its contents through a warrant is just plain nuts. Ideally this product would be removed from the marketplace; however, that is unlikely. We cannot talk about constitutional rights in a vacuum. The victims of these criminals have rights too. It is sad that we have people who want to do harm to others. That is the real problem that we have to solve. Law enforcement needs to have the proper legal tools, with proper judicial oversight, to do its job. Criminals should not be able to hide behind constitutional technicalities.

