Apple’s Use of Face Recognition in the New iPhone: Implications

Apple unveiled its new iPhone X Tuesday, and it will include extensive face recognition capabilities. Face recognition (as I have discussed) is one of the more dangerous biometrics from a privacy standpoint, because it can be leveraged for mass tracking across society. But Apple has a record of achieving widespread acceptance for technologies that it incorporates into its phones. So what are we to think of this new deployment?

The first question is whether the technology will be successful. Face and iris recognition technology incorporated into some other phones (such as Samsung’s) has been widely seen as insecure and/or impractical. In recent years Apple has acquired several face recognition companies, and the company claims the new phone’s capabilities will include projecting 30,000 infrared dots onto a user’s face to build a three-dimensional map of it, the ability to distinguish a real face from photographs and masks, and extremely low false-positive and false-negative rates. But only time will tell whether people find the face recognition functionality on this and other phones practical enough for widespread adoption.

Even if it is successful, some of the first-order privacy implications of Apple’s new deployment can be overblown—mainly, the collection of user face data through the iPhone unlocking function. First, Apple has said that the face recognition data will be stored locally on users’ phones, and not transmitted to a central database. Second, for the time being there are far bigger, more comprehensive collections of individuals’ photographs, including the state DMV databases, and photo databases maintained by the State Department and Customs and Border Protection, not to mention Facebook and Google, which store billions of photographs. For mass-surveillance purposes, those photographs would probably serve just as well as Apple’s 3-D face maps.

Of course, whatever promises Apple makes today could be rolled back in the future, not to mention ignored by other companies if the technology becomes standard. Our big worry is that face recognition will be used to identify and tag people in new, privacy-invasive contexts, leading ultimately perhaps to a pervasive system of identification that tracks Americans in their every movement. Face recognition from mobile phone unlocking could certainly in the future become a key part of such a surveillance infrastructure.

Still, while storage of face templates for phone unlocking is what worries many people up front, it is not an immediate threat.

Normalization

Many people have worried that Apple’s launch of face recognition will socialize Americans into accepting face recognition as a part of their everyday lives, thereby making far spookier deployments much easier down the road.

There is probably some truth in that scenario, but it’s not clear whether intentionally staring into one’s own phone will leave people comfortable with being tagged as they look into a store window or walk down the street. There is a difference between a technology we control and one that is applied to us as a power play. In this case, face recognition does not leak data to Apple and is under the control of the phone’s owner, who can turn it off, and who benefits directly (if it works well) from the convenience it provides. I don’t think people will have much trouble differentiating between empowering and disempowering uses of the technology, and I expect each use of face recognition technology will proceed according to its own cost/benefit logic; just because it may prove to make sense and be accepted to unlock phones does not mean it will make sense and be accepted on street cameras.

Broader uses

That said, broader applications of face recognition beyond unlocking phones are more likely to have a normalizing effect—and pose privacy problems. Everybody is focused on how face recognition in the new iPhones will be used for unlocking the phone. But the new face recognition functionality will also be accessible to other applications. As Fast Company reported, the new phone’s face recognition capability will be “laced throughout” the operating system:

When coders dug through Apple’s beta versions of iOS5 they found what were deemed to be “highly sophisticated” API systems that let an iPhone automatically track eye positions and mouth positions (so the angle to the user, and possibly where their attention is being directed could be calculated) as well as passing key data on to a face recognition algorithm that would be accessible to all apps…not just Apple’s own ones.

That means several things. First, the privacy policies Apple applies to the phone-unlocking application of face recognition won’t necessarily be embraced by giant, more advertising-oriented companies like Facebook and Google, whose apps reside on many Apple phones and may tap into this functionality. Second, it means that a thousand face-recognition flowers will bloom as small app developers explore all manner of possible uses of the technology, no doubt including many that nobody has yet thought of, and some that will have dodgy privacy practices.
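To make that concern concrete, here is a minimal sketch of what third-party access to face detection already looks like today, using Apple’s publicly documented Vision framework. The function name and flow here are hypothetical illustrations of my own; this is not the iPhone X’s Face ID API, and the depth-based capabilities described above may be exposed to apps differently.

```swift
import UIKit
import Vision

// Illustrative sketch only: face *detection* via the Vision framework,
// which any third-party app can run on any image it has access to.
// This is not Apple's Face ID API.
func detectFaces(in image: UIImage, completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // Ask Vision for face rectangles plus facial landmarks (eyes, mouth, etc.).
    let request = VNDetectFaceLandmarksRequest { request, _ in
        completion(request.results as? [VNFaceObservation] ?? [])
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([]) // report no faces if Vision fails
        }
    }
}
```

Full face recognition, as opposed to mere detection, would add a matching or identification step on top of observations like these; the point is simply that the building blocks are already within reach of every app developer.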

For example, Carnegie Mellon researchers in 2011 showed that face recognition could be combined with social networking data to identify people walking around in public and provide instant information about their interests based on their social media data. It’s not hard to imagine apps being developed in that direction—apps that could be exploited by government, companies, or for interpersonal surveillance.

Gaze detection & attention tracking could be another big use. In that case there may be both a strong financial incentive for deployment, and the generation of significantly more intrusive information about users than photographs of their face. As I discussed in a 2013 post about eye tracking, the technology can be used not only to minutely measure interest and attention for marketing purposes, but also potentially to provide measurements of intelligence, sexual preference, and drug and alcohol use, as well as the presence of cognitive disorders and mental illnesses.

In short, if phone unlocking doesn’t acclimate people to surveillance via face recognition, the emergence of a far broader and more intrusive set of phone-based applications of the technology might, in the absence of strong legislative or other privacy protections.

Other questions and concerns

  • The always-on nature of the phone’s camera. Some have raised concerns over the fact that, in order for users to “wake up” their phones with their faces, the phones’ cameras will need to be always on. As I discussed with regard to other “always on” technologies such as the Amazon Echo, this can raise questions about security and unintended collection. One question would be what kind of information the phone logs. For example, does it store any information about faces that might appear before a phone’s screen that do not belong to its owner?
  • Security questions. Apple claims that its implementation of face recognition for phone unlocking won’t be fooled by such things as photographs or masks, and that face recognition data will be stored in phones’ encrypted, hardware-protected “Secure Enclave” (a sketch of how apps are expected to interact with that enclave appears after this list). Nevertheless, experts say it’s inevitable that people will figure out how to break those protections. That doesn’t mean the technology won’t be worth using; security is always a tradeoff against other values such as convenience. But if significant vulnerabilities emerge, they could render it impractical.
  • Questions of consent. Users will reportedly be able to turn off the face recognition functionality, but the larger questions are whether other applications may nonetheless be able to use it, and, more broadly, whether the mobile phone ecosystem will evolve toward one in which users are all but compelled to turn it on if they want to make full and convenient use of their phones.
  • Law enforcement access. One outstanding question is whether, and under what standard, law enforcement will be able to access face data. There is no case law establishing whether a person’s facial imprint is protected by a probable cause warrant requirement, and law enforcement has argued that the warrant requirement doesn’t apply to certain other data (including location data and, in the national security context, communications metadata).
  • Racial differences. As has been widely reported, many face recognition implementations have repeatedly been found to be more accurate with light-skinned people than with dark-skinned people. Has Apple solved that problem, or will dark-skinned people have more trouble getting their phones to recognize them?
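For context on the Secure Enclave point above, here is a minimal sketch of how third-party apps request the system’s biometric check today, via the documented LocalAuthentication framework. I’m assuming Face ID will be exposed to apps the same way Touch ID is; the function name is a hypothetical illustration.

```swift
import LocalAuthentication

// Illustrative sketch only: how third-party apps use the system's biometric
// check today. The app never receives biometric data; it gets a yes/no
// answer, and the enrolled template stays in the Secure Enclave. (Assumes
// Face ID is exposed to apps through the same framework as Touch ID.)
func unlockSensitiveFeature(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false) // biometrics unavailable, not enrolled, or disabled
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved documents") { success, _ in
        completion(success) // a success-or-failure answer, never face data
    }
}
```

The design matters: the app learns only whether the check succeeded, and the enrolled face template never leaves the Secure Enclave. Whether that clean separation extends to all the other face-related APIs described earlier is a separate question.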

Assuming that this face recognition implementation proves practical and becomes widespread, many people are going to need to decide whether to use it. Unless it is proven otherwise, consumers should probably assume that it’s less secure than entering a passphrase. Of course, all security is a tradeoff, and the convenience it provides may prove worth a security hit for many people who are not worried about someone creating a 3D mask of their face in order to get into their phone, or whatever else proves necessary to defeat it.

Another factor is the 5th Amendment right against self-incrimination. Some courts have ruled that if law enforcement has a warrant to search your phone, they can require you to provide your fingerprint to open it up, reasoning that biometrics are identifiers, not testimony. But under the current cases, it is more difficult for the government to force you to divulge a passcode. The doctrine is both murky and still developing, but it's likely that courts would see face prints the same way they've seen fingerprints. On the other hand, Apple does offer a “cop button”: if the user presses the power button 5 times, the fingerprint or face print authentication is disabled, requiring entry of a passphrase.

Face recognition is becoming a hot technology, with many seeing it as exemplifying “the future”—kind of like nuclear power and flying cars were seen in 1954. That means everybody is trying to figure out how they can use the technology. Many of those attempts will fail as face recognition proves too impractical or intrusive to win acceptance, but it will find its niches and our phones may be one of them. If so, the question becomes, as always, who will be most empowered by it?


Silvanna Finnerty

Is your article based on a real event? Because the same thing has been happening to me in North Idaho for over 3 years. I am an ACLU Boise and national contributor from Bonner County.

The Future

Unless you reject this technology it will become the future! Laws must be enacted NOW that ensure a physical image, DNA, fingerprints, or other biometrics cannot and must not be required by a device as a "security code".

For real, this should be the ACLU's number one priority! I'm not saying that these devices should be banned from using the technology and offering it as an option to users. The key word is OPTION; if I don't want to use that type of security but still want to use the device, I should be able to.

One day device manufacturers might require a face login to use Apple Pay or Google Play, etc. No free-market ma-and-pa device manufacturer will be able to compete and offer a non-intrusive device. I'm sorry, but I see that path as the future of humanity.

TurboJet70

Amen. Our friends at the ACLU need to be focusing on matters such as this a whole lot more and matters such as children who were brought into this country ILLEGALLY a whole lot less.

eD! Thomas

I don't disagree with the overall thesis of this article, but some of your assumptions about the technology behind Face ID (and Touch ID, for that matter) are incorrect.

For instance, the camera is *not* always on -- it is activated when the lock screen is active, which requires either picking up the phone (so it will be powered on via the accelerometer) or by pressing the side button. That said, the microphone *is* always on so long as you enable "Hey Siri", so being cautious of that is advisable.

The implementation of facial recognition is also seemingly misunderstood here -- the actual logging in isn't based on a traditional picture, as it is on other phones using the same technology, but on a 3D model of your face and, more importantly, infrared scans. This mitigates the risk of a picture being used to unlock your phone as they don't give off body heat or have physical depth to them. (The mask thing is weird and I have no idea how they work around that other than no mask is going to allow for body heat in all the right spots, y'know?)

As for the Secure Enclave, apps cannot read or write to it; only iOS has access to that information. When you use Touch ID currently to access an app, it is passing a call to the Secure Enclave to verify the user is who they say they are -- it doesn't get any information other than a yes or no. This prevents apps from being able to use that information if the user decides to not enable Face ID or Touch ID.

Finally, the "Cop Button" you mentioned on the iPhone X is much easier -- just hit all the side buttons and Face ID is disabled until you log in with a passcode, which is much easier (and less noticeable) than hitting the side button five times.

Anonymous

https://youtu.be/N81TVMd8k_E

So easy a kid can do it! How long before the iPhone X is hacked? Six months? A year? Probably around 18 months after release I'll be able to watch you through your camera and listen to your microphone. No security code or Face ID needed.

Use it or lose it.

Trent P. Reznor

Anonymous, I really hope you aren't serious with that link and don't believe anything that kid said in the video. As eD! said above, Apple actually does take great care with its users' privacy. It's been one of the major companies fighting government and law enforcement efforts to create backdoor access to their devices. I currently trust Apple way more than Google with the privacy of my data.

Vincent

I agree it's a worrisome "innovation" for its potential abuses. But opening your laptop or phone with your fingerprint posed similar risks, and adoption has been pretty low. And I'm not convinced people are ready to stare into their phone to unlock it. In my opinion, it's more of a gadget. Apple hasn't been able to truly innovate since Jobs' passing, and it's desperately looking for new "features". There has to be something new the press can talk about. But it's doubtful this is the advent of a new way to authenticate for most of us.

What's much more worrying is the data your phone can extract from your photo albums and compare with others to recognize you and other people in your photos. Most people already take pictures of themselves, and the technology to compare their face with others already exists.

Kind regards,
Vincent
http://www.vincentbettschart.ch

Anonymous

It's important to remember that many, if not most, investigations by police and federal agencies are NOT governed by the 4th Amendment. Many times these officials and contractors investigate and punish legal 1st Amendment activity.

For example: officials trolling Facebook and bypassing your privacy filters violates the 4th Amendment's letter and spirit. If you were to simply "like" police body-cameras or oppose tanks/grenade launchers in your local police department, you may be investigated and even punished for that legal 1st Amendment exercise. Even though all officials, including police, voluntarily agree to the constitutional oath of office, many seem to have no regard for their employment contract, which grants them authority.

The lesson is that government officials and their contractors are NOT always using these technologies for crime fighting or to protect the public. Many times, if not most, it is simply to silence and disrupt legal 1st Amendment activity. This technology faces the same fraudulent abuse by both government and private entities.

Joe S.

Thank you. So glad you brought up this excellent point.

Anonymous

On the plus side, we will no longer all look the same to the A.I.
