
Lessons From the Celebrity iCloud Photo Breach

Chris Soghoian,
Principal Technologist and Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
September 2, 2014

Based on initial media reports, it seems that intimate, private photographs from several celebrities’ online accounts have been accessed without their consent and widely shared on the Internet. For now, many details about the breach (or breaches) remain unclear. One working theory, which is supported by anecdotal evidence, suggests that a security vulnerability in Apple’s iCloud service may have been exploited to gain access to the celebrities’ accounts and download their photos.

The blame game

In the flurry of news after the photos surfaced, several commentators smugly suggested that some blame should fall on the victims, either because they used weak passwords, or because they were using their phones to take sexually explicit photographs. This is ridiculous.

These celebrities exhibited behavior that is perfectly normal. As researchers like Joseph Bonneau have documented at length, most people choose bad passwords, and reuse them for multiple accounts. Similarly, the fact that these celebrities took sexually explicit photographs of themselves or were photographed by their partners using mobile phones is just further evidence that deep down, celebrities are just like the rest of us. As the old saying goes, the best camera is the one that’s with you, and as our cell phones have morphed into tiny computers with the ability to shoot photos and movies, it isn’t surprising that people are using them to capture private moments too.

For the victims whose privacy has been violated, this experience is awful. For the rest of us, it can be a teaching moment and an opportunity to think about what we expect from the companies that build the devices and online services we trust with our most private information.

Could Apple have prevented this?

According to media reports, a long-standing vulnerability in Apple’s “Find My iPhone” service was exploited to gain access to iCloud accounts. Many online services temporarily lock access to an account after a few failed login attempts, in order to prevent a so-called brute-force attack, in which an attacker repeatedly tries common passwords until the correct one is discovered. Most of Apple’s services already used such a rate-limiting mechanism; the Find My iPhone service did not. Apple has, over the past few days, fixed this issue.
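To make the idea of rate limiting concrete, here is a minimal sketch of an account-lockout check. It is purely illustrative and not Apple’s implementation; the thresholds, the in-memory dictionaries, and the verify_password callback are all assumptions made for the example. The point is simply that once a lock is in place, an attacker can no longer try thousands of common passwords in quick succession.

```python
import time

MAX_ATTEMPTS = 5                 # illustrative threshold
LOCKOUT_SECONDS = 15 * 60        # illustrative 15-minute lockout

failed_attempts = {}             # account -> consecutive failure count
locked_until = {}                # account -> timestamp when the lock expires


def check_login(account, password, verify_password):
    """Return True on a successful login, False otherwise, enforcing a temporary lockout."""
    now = time.time()

    # Refuse to even check the password while the account is locked.
    if locked_until.get(account, 0) > now:
        return False

    if verify_password(account, password):
        failed_attempts[account] = 0
        return True

    # Count the failure and lock the account once the threshold is reached.
    failed_attempts[account] = failed_attempts.get(account, 0) + 1
    if failed_attempts[account] >= MAX_ATTEMPTS:
        locked_until[account] = now + LOCKOUT_SECONDS
        failed_attempts[account] = 0
    return False
```

A service without a check like this, as Find My iPhone reportedly was, lets an attacker run through a list of common passwords as fast as the server will respond.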

In the days and weeks to come, Apple will no doubt be justifiably criticized for failing to protect the Find My iPhone service with a rate-limiting mechanism. There are, however, other deeper issues worth probing, such as the default security settings that mobile phones ship with, and the extent to which these devices and synchronized online services can withstand an attack by determined adversaries.

One password to rule them all

It is likely that many of the victims also had poor-quality passwords, which made it easier for the hackers to gain access to their accounts. The use of weak, low-entropy passwords is not specific to Apple accounts, but Apple requires its customers to enter their password on their phones whenever they wish to download an app from the company’s App Store, even for free apps. This encourages users to pick short, easy-to-enter passwords.
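A rough way to see why short passwords are so weak is a back-of-envelope entropy estimate. The sketch below assumes each character is drawn uniformly at random from its alphabet, which already overstates the strength of real human-chosen passwords; the specific lengths and alphabet sizes are illustrative, not drawn from the incident.

```python
import math

def estimated_entropy_bits(length, alphabet_size):
    """Upper-bound estimate: bits of entropy if every character were chosen
    uniformly at random. Human-chosen passwords typically have far less."""
    return length * math.log2(alphabet_size)

# A short all-lowercase password vs. a longer mixed-character one.
print(estimated_entropy_bits(6, 26))   # ~28 bits: feasible to guess by brute force
print(estimated_entropy_bits(12, 62))  # ~71 bits: vastly harder to guess
```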

No doubt, Apple’s privacy and security teams will be carefully analyzing the security of their authentication systems as a result of this incident. Apple should seriously consider permitting users to set a short, easy-to-remember password or PIN for on-device tasks such as installing apps from the App Store, which would free them to use a longer, higher-quality password for remote access to iCloud.

The downside to default, automatic cloud backups

It appears that iOS devices are automatically opted in to Apple’s Camera Roll feature, which uploads all photos to Apple’s iCloud backup service. As a result, many users are likely using this service without realizing it and do not understand the associated security and privacy risks.

There are, no doubt, useful aspects to nudging users towards automatic online photo backups – they ensure that a lost or stolen iPhone does not result in the permanent loss of photos, without requiring that the device owner first configure a backup service. Similarly, photos taken during a protest are instantly archived online, which can be particularly useful if police seize phones or force people to delete photos they have taken.

Automatic online backups of photographs may be appropriate for photos of your friends, kids, and pets. However, given that people also routinely take intimate, private photos with their smartphones, automatic backups may not always be desirable. One obvious solution to this is to provide users with an easy way to take private photos that won’t be uploaded, while still offering the convenience of automatic backups for the majority of photos that aren’t sensitive.

The need for a private photo mode

Apple, Google, Microsoft, and Mozilla already include “private browsing” modes in their web browsers. Clearly, these companies recognize that there are certain activities that their customers will engage in online that should remain private (or at least should not be revealed in the browser’s history).

One thorny problem with these private browsing modes is that the companies steadfastly refuse to publicly acknowledge how they are actually used. Instead of recognizing that millions of people use them to look at pornography, the companies describe them as being useful for shopping for engagement rings or looking up health information. No doubt, these are occasional uses, but they are not the primary one. The companies know this, but they don’t want to admit it.

This prudish approach to describing private browsing may make life easier for the companies’ marketing departments, but it seriously undermines user education when companies refuse to describe how their products and services are actually used. Effective privacy education should not be communicated with a nudge and a wink.

Apple, Google and the other big tech companies should acknowledge that millions of their customers regularly use their products to engage in sensitive, intimate activities. These companies can and should offer a “private photo” option for sensitive photos that prevents them from being uploaded to the cloud. More importantly, they should treat their customers like grownups and educate them about how they can use their products and services to engage in intimate activities, as safely as possible.
