The Real Stakes of Apple’s Fight With the FBI

On Tuesday, the government obtained a court order compelling Apple to hack into an iPhone as part of the FBI’s investigation into the San Bernardino shooters. While the government’s investigation is an important one, the legal order it has obtained crosses a dangerous line: It conscripts Apple into government service and forces it to design and build what is, in effect, a master key that could be used as a mold to weaken the security of an untold number of iPhones.

The resulting order is not only unconstitutional; it also risks setting a precedent that would fundamentally undermine the security of all devices, not just the one iPhone being debated in the news.

A bit of background is necessary to understand this debate.

As part of its investigation, the FBI has apparently obtained an iPhone 5C used by one of the shooters. The bureau has said that the phone is encrypted and protected by a passcode, and that it needs Apple’s assistance to unlock the phone. Specifically, it has asked Apple to design and write custom software that would disable several security features on the phone.

While Apple has generally cooperated in the investigation, it has refused the FBI’s latest demand to write malware that would help the FBI hack the device. To its credit, Apple has poured incredible resources into securing its mobile devices. One consequence of that effort is that Apple does not have a ready way of breaking into its customers’ devices. In the words of Apple’s CEO, Tim Cook: “We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

But the FBI is dismissive of that effort. According to its legal filing, the FBI believes that Apple could, if compelled, build a master key that would allow the FBI to try to break into iPhones like the one involved in the San Bernardino investigation. The FBI acknowledges that this would require Apple to write new software and then cryptographically “sign” that software (as the iPhone will accept only software updates signed by Apple).
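
To make that signing requirement concrete, here is a minimal, hypothetical sketch in Python of how a device could check that an update was signed by its manufacturer before accepting it. The function name verify_update and the choice of Ed25519 keys are illustrative assumptions for this sketch, not a description of Apple’s actual update mechanism.

```python
# Hypothetical sketch: a device accepts a firmware image only if its signature
# verifies against the manufacturer's public key (illustrative, not Apple's code).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def verify_update(firmware: bytes, signature: bytes,
                  manufacturer_key: Ed25519PublicKey) -> bool:
    """Return True only if the firmware was signed with the manufacturer's private key."""
    try:
        manufacturer_key.verify(signature, firmware)
        return True
    except InvalidSignature:
        return False


# The private key stays with the manufacturer; only the public key ships on devices.
manufacturer_private_key = Ed25519PrivateKey.generate()
device_trusted_key = manufacturer_private_key.public_key()

official_build = b"official OS image"
official_signature = manufacturer_private_key.sign(official_build)

print(verify_update(official_build, official_signature, device_trusted_key))        # True
print(verify_update(b"modified OS image", official_signature, device_trusted_key))  # False
```

In this sketch, any software not signed with the manufacturer’s private key simply fails the check, which is why the order requires Apple’s active participation: only a build signed by Apple would be accepted by the phone.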

A federal magistrate judge granted the FBI’s request the same day but gave Apple five days to object. Again to its credit, Apple has vowed to fight.

It is critically important that Apple win—for cybersecurity and for the fate of privacy in the digital age—for several reasons.

First, the government’s legal theory is unbounded and dangerous. The government believes it has the legal authority to force Apple into government service, even though the company does not actually possess the information the government is after. Of course, historically, the government has sought and obtained assistance from tech companies and others in criminal investigations—but only in obtaining information or evidence the companies already have access to.

The difference between those cases and Apple’s is a radical one. If Apple and other tech companies—whose devices we all rely upon to store incredibly private information—can be forced to hack into their customers’ devices, then it’s hard to imagine how any company could actually offer its consumers a secure product. And once a company has been forced to build a backdoor into its products, there’s no way to ensure that it’s only used by our government, as opposed to repressive regimes, cybercriminals or industrial spies.

Second, this debate is not about one phone—it’s about every phone. And it’s about every device manufactured by a U.S. company. If the government gets its way, then every device—your mobile phone, tablet or laptop—will carry with it an implicit warning from its manufacturer: “Sorry, but we might be forced to hack you.”

Some might accept that risk if it were possible to limit access to legitimate governmental purposes, overseen by a judge. But as Apple’s Cook points out, backdoors are uniquely dangerous: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”

That risk is only growing every day as the “Internet of Things” expands. For the government, every device connected to the Internet will be more than just a novel convenience—it will be a new window into your home. The fridge that responds to your verbal commands might have a backdoor to allow for remote listening. The TV that allows you to video chat with your family might be commandeered into a ready-made spy camera.

These are the real stakes of the debate: Either American companies are allowed to offer secure products to their consumers, or the U.S. government is allowed to force those companies to break the security of their products, opening the door for malicious hackers and foreign intelligence agencies alike. For the sake of both our privacy and our security, the choice is clear.

This post was originally published by Time.


Lindsey Brutus

Why can't the government give Apple the phone and let Apple give them the info off the phone without giving them anything else? Then Apple can destroy the phone and the government cannot snoop on anyone. Does the government not trust Apple? We certainly, sometimes, do not trust the government!

Anonymous

We already have examples of government bodies using "backdoors" for poorly thought out purposes that are certainly privacy infringing. The example that comes to mind is several school districts installing software to remotely access cameras on student laptops.

Anonymous

To all those who believe Apple is right, I pose this question. If your loved one, or possibly you, were abducted by some terrorist group and strapped with a bomb set to go off in 24 hours, and the only way to disable it was to obtain the disable code locked in the Apple phone used by one of the terrorist group's members, how would you feel then? It's easy to take the high road and cite privacy concerns, etc., until it directly affects you. At the end of the day, Apple doesn't care about privacy; it cares that consumers may stop buying the phone because it can be broken into and that they might lose business. I am certain the only people they would lose business from are those most concerned about hiding stuff they don't want out.

Anonymous

If Apple is concerned with losing business, why is that not a legitimate concern? The FBI has tried and failed to persuade Congress to ban phones with unbreakable encryption, so Apple made an entirely legal product. This leads to another issue. Who is going to pay for the lost sales, damage to the brand and so forth? From what I can tell, the government is willing to pay for the engineering costs of making an alternative firmware, but not for the damages in terms of lost sales, damage to the brand and so forth. That might amount to a "taking" without "just compensation".

Rich Garella

Your argument applies equally to paying ransom to hostage-takers. Of course any one of us, if our own life or that of a loved one were at stake, might want to make a decision that would be disastrous in the long run for our entire society. It would be entirely understandable, and yet entirely wrong, to act on that basis.

Anonymous

Your hypothetical example smells of typical government propaganda to us. Has that scenario ever happened in the entire history of humanity, and will it ever happen in the future? No and no. On the other hand, has the government ever abused its power to arrest, attack, imprison, and deny civil rights to innocent citizens? Every second of every day, 24/7/365.

Anonymous

I'm so tired of people using this analogy, because it's absurd. I am against torture and the death penalty, so naturally, my conservative friends immediately go to: if your child were kidnapped, would you support someone being tortured to find him? If your child were killed, would you support the death penalty? In both cases I have to answer this:
1) I don't know; it's hard to say what one would do under the most extreme emotional distress, but
2) I would probably not only support but carry out the torture and killing myself if given the opportunity, and
3) this commentary is actually pointless and irrelevant to the ACTUAL argument of whether a society and government should enshrine AS LAW the right to do these things.

In other words, it's a horrible idea, perhaps the worst ever, to suggest that we should make our society's laws based on what we would DO in the event our children were just murdered. Seriously? Is that how you or anyone thinks we should run our country? There's a good chance I would press every nuclear button on the planet if my kids were killed, so I suppose that's now good policy? Dumb dumb dumb.

As painful as it is for people to have to go through some of the horrible and evil things that other people do, it's even worse to have a government, which is supposed to be the collective reason of society, sanctioned to do the same evil things we so desperately want people to stop doing to each other.

Sean

Anonymous

Good points, but part of the problem is that the general American public isn't seeing the complete picture of the facts and evidence.

In other words, how many innocent Americans have been killed or have died prematurely due to COINTELPRO-style blacklisting tactics and programs? We don't know how many people die from domestic spying. My guess is that the number far outweighs the number of innocent people saved.

For example: COINTELPRO-style blacklisting today, intentionally or not, is almost identical to the blacklisting program perpetrated by the East German Stasi (the communist secret police). During the Cold War, this type of blacklisting program and its tactics resulted in one of the highest death rates in Europe. Essentially, the communist model (now adopted by the U.S.) punished political speech and even political association, using employment tampering, slander and other defamation to destroy innocent citizens: it wrecked targets' livelihoods, marriages, family ties, friendships and reputations, and this COINTELPRO/Stasi-style blacklisting pushed most of them to suicide. Since targets are intentionally never confronted, judges can't police it.

Domestic spying is deadly on a mass scale. The best way to prevent it is by requiring probable cause of a real crime before allowing bureaucrats and other officials to snoop into your private life. Due to mission creep, government agencies must always manufacture new threats and new enemies; they need enemies to justify their budgets.

American voters are not seeing the lives lost to this type of blacklisting and to domestic spying that is not based on any wrongdoing, so privacy can also save lives.

Anonymous

"i pose this question."

The question that you pose is a logical fallacy known as a "straw man." You can pose any question that you please and so can I.
Why should I feel anything more about your straw man than I should feel about the possibility that someone that I love is going to die at some unknown point in the future? Your straw-man scenario is extremely unlikely to happen, so extremely unlikely that bothering about it is nonsensical. On the other hand, there can be *no doubt* that, in the future, someone that I love dearly will die. And there is nothing that any arm of government can do to Apple that can possibly prevent that from happening.
Consider a scenario in which the scenario that you propose does not happen. That's my preferred scenario. And, of course, you are free to prefer a different scenario.

Rex

If you want to go to that extreme of a hypothetical... what if Superman had landed in Nazi-era Germany instead of Kansas and been raised by Joseph Goebbels? The FBI would then be helpless even with Apple's assistance, and we would all be even more doomed than the fear-mongering lamestream media would have us believe.
