Spies Want to Make the FaceTime Eavesdropping Bug Into a Feature

On Monday, we learned that Apple’s FaceTime video chat service suffers from a bug that permits other people to get audio and even video directly from your iPhone or Mac computer. This can happen without your permission and without the standard indication that the other person is listening and watching. Anyone with FaceTime could eavesdrop on any other FaceTime user simply by calling and performing one additional step — and the victim’s device would start transmitting, even if they never accepted the call.

This is a fairly catastrophic bug. Yet alarmingly, if major national spy agencies get their way, a comparable bug will become a standard feature in almost every popular communications product currently in use. As incredible as that might seem, we know this because they’ve told us so.

The FaceTime bug is a failure in the user interface — the parts of the software that make the user aware of and in control of what the device is doing. FaceTime’s user interface fails in at least two ways that are related, but distinct. First, it sends audio and video to the attacker without the victim’s permission — the transmission starts without the victim approving it. Second, it does so without the victim’s knowledge — the normal indication that an active call is underway is absent.

The engineering community has understood for years that user interface failures are a frequent cause of security failures and that these failures are often worse than others. There are organizations, books, and meetings dedicated to working on trustworthy and secure user interfaces, and Apple itself has guidelines that reinforce the importance of the user interface in security software.

But officials from Britain’s Government Communications Headquarters (GCHQ) — a close surveillance partner of the U.S. National Security Agency — recently proposed that government agents be able to inject hidden participants into secure messaging services. This proposal has come to be known as the “Ghost proposal.”

Written by GCHQ’s Ian Levy and Crispin Robinson, it recommends institutionalizing an untrustworthy user interface when the government wants to spy on a conversation:

It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved — they’re usually involved in introducing the parties to a chat or call…. In a solution like this, we’re normally talking about suppressing a notification on a target’s device… and possibly those they communicate with.

In short, Apple — or any other company that allows people to privately chat — would be forced to allow the government to join those chats as a silent, invisible eavesdropper. Even the most secure apps like Signal (which we recommend) and WhatsApp, which use end-to-end encryption, would be rendered insecure if they were forced to implement this proposal.
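To make the mechanics concrete, here is a minimal sketch of how such a silent participant could work. Everything in it is illustrative — the toy XOR “cipher,” the class and method names, and the roster API are invented for this example and do not describe any real service. The point it demonstrates is the one the GCHQ authors concede: because the provider’s identity system decides who receives a copy of each message, the provider can add a recipient while hiding that recipient from every screen.

```python
from dataclasses import dataclass, field

def toy_encrypt(message: str, key: int) -> bytes:
    # Stand-in for real end-to-end encryption: XOR with a one-byte key.
    return bytes(b ^ key for b in message.encode())

def toy_decrypt(ciphertext: bytes, key: int) -> str:
    return bytes(b ^ key for b in ciphertext).decode()

@dataclass
class IdentityServer:
    # The provider's identity system: it alone decides who is in the chat.
    members: dict[str, int]                      # member name -> member's key
    hidden: set[str] = field(default_factory=set)

    def encryption_roster(self) -> dict[str, int]:
        # Clients dutifully encrypt one copy for everyone listed here...
        return dict(self.members)

    def display_roster(self) -> dict[str, int]:
        # ...but the app's UI only ever shows this filtered list.
        return {n: k for n, k in self.members.items() if n not in self.hidden}

server = IdentityServer(members={"alice": 0x17, "bob": 0x2A})

# The provider quietly adds a law-enforcement "ghost" and suppresses
# the notification that a new member joined:
server.members["ghost"] = 0x5C
server.hidden.add("ghost")

# Alice sends a message; her client encrypts a copy for every member
# the server told her about:
copies = {name: toy_encrypt("meet at noon", key)
          for name, key in server.encryption_roster().items()}
```

Nothing about the encryption step changed; the eavesdropping comes entirely from the gap between the roster clients encrypt to and the roster users see.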

[Embedded YouTube video: youtube.com/embed/vSQQXS3q1k8 — protecting your digital privacy]

The Ghost proposal institutionalizes a significantly worse user interface failure than Monday’s FaceTime flaw. With the FaceTime bug, the vulnerable user at least gets an alert about an incoming call to know that something is happening, even if the user interface is misrepresenting the situation and violating the user’s expectations. With the Ghost proposal, the user has no way of even knowing that something is happening that violates their expectations.

The GCHQ authors claim that Ghost provides law enforcement with wiretap-like capability, and “you don’t even have to touch the encryption.” This is true, but only in the most disingenuous sense.

When people want encryption in their communications tools, it’s not because they love the mathematics. People care because of what encryption does. Encryption and other cryptographic protocols are necessary to protect people through properties like confidentiality, integrity, and authenticity. The Ghost proposal essentially says, “Let us violate authenticity, and you can keep encryption.” But if you don’t know who you are talking to, what security guarantee is left?
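One way to see what is lost: the sketch below runs an unauthenticated Diffie-Hellman key exchange (with toy, insecure parameters chosen purely for illustration). The modular arithmetic — the “encryption” part — is untouched, yet because neither side verifies who it is talking to, a party in the middle ends up sharing a key with each of them and can read everything.

```python
# Toy public parameters — far too small and simple for real use.
P, G = 2_147_483_647, 5   # a Mersenne prime modulus and a small base

def dh_public(secret: int) -> int:
    # My public value: G^secret mod P.
    return pow(G, secret, P)

def dh_shared(their_public: int, my_secret: int) -> int:
    # The shared key: (their public value)^my_secret mod P.
    return pow(their_public, my_secret, P)

# Alice and Bob each pick a secret; Mallory sits in the middle.
alice_secret, bob_secret, mallory_secret = 123_457, 789_123, 555_555

# Mallory substitutes her own public value in both directions, so each
# honest party unknowingly completes the exchange with Mallory instead:
key_alice_side = dh_shared(dh_public(mallory_secret), alice_secret)
key_mallory_with_alice = dh_shared(dh_public(alice_secret), mallory_secret)

key_bob_side = dh_shared(dh_public(mallory_secret), bob_secret)
key_mallory_with_bob = dh_shared(dh_public(bob_secret), mallory_secret)
```

Encryption under either derived key works perfectly. What failed is authenticity — knowing whose key you derived — which is exactly the property the Ghost proposal asks services to abandon.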

Cryptography is necessary to ensure these properties, but it is not sufficient on its own. The entire system, from the cryptographic mathematics to the software implementation to the network protocols to the user interface, is critical to providing secure communications in an increasingly hostile online environment.

And let’s not forget: If companies like Apple are compelled to enable governments to participate silently in private conversations, that tool won’t be available only to democratic governments — it will be employed by the world’s worst human rights abusers to target journalists, activists, and others.

We should be clear: All software has bugs, and Apple’s software, as good as it is, is no exception. Although it took too long for Apple to recognize the flaw, the company is now treating it with the gravity it deserves.

Since the vulnerability is accessed through Group FaceTime, Apple has taken those servers entirely offline until the FaceTime app itself can be fixed. But any connected FaceTime app will become vulnerable again if Apple re-enables the Group FaceTime servers, so until an update ships, people should probably keep FaceTime disabled. (This is a good reminder of why it’s important to install new software updates as soon as they’re available.)

That such a serious flaw could be discovered in the software of a company known for prioritizing privacy should be a warning to anyone, including GCHQ and the NSA, who advocates for intentional security flaws to facilitate government surveillance. It’s very difficult to engineer software correctly in the first place, and it’s even more difficult to design it with intentional flaws, however limited. If a mechanism exists to deliberately make the user interface untrustworthy, it will be an attractive target for malicious hackers and other hostile actors. Who will be responsible for its inevitable abuse?

Any future discovery of a software flaw that enables eavesdropping, false identities, message tampering, or any other compromise of communications security should be treated the same way as this latest weakness: with serious emergency mitigations, followed as soon as possible by a software update that removes the flaw. And governments certainly shouldn’t consider adding such vulnerabilities on purpose.
