
FTC Weighs In On Face Recognition Technology

Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
October 24, 2012

The FTC on Monday released a staff report on face recognition, offering “best practices for common uses of facial recognition technologies.” The report grew out of a workshop the agency held on the issue last year. Face recognition is in some ways the ultimate biometric identifier, and its potential to finally and decisively end the possibility of anonymity in public is very real.

The FTC report offers a useful rundown of the range of possible uses for the technology, including:

  • Face detection (locate a face in a photo)
  • Mood or expression analysis (are you worried or relaxed?)
  • Demographic analysis (what is your age range or sex?)
  • Identification (who are you or have we seen you before?)

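To make that distinction concrete, here is a minimal sketch (mine, not the FTC’s) of the first and least invasive item on the list. It uses OpenCV’s bundled Haar-cascade model to perform face detection only, locating faces in a photo without attempting to identify anyone; the file name is illustrative.

    # A minimal sketch of face *detection* only: it locates faces in an image
    # but makes no attempt to identify them. Uses the pre-trained frontal-face
    # model that ships with the opencv-python package.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    image = cv2.imread("photo.jpg")                 # illustrative file name
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # the detector expects grayscale

    # Returns one (x, y, width, height) bounding box per face found.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"Found {len(faces)} face(s)")
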
Obviously, it’s identification that raises most of the privacy issues. The New York Times ran a story Monday on a new legal challenge to a New York law banning the wearing of masks at protests. Believe it or not, in New York it is illegal for three or more people to wear masks in public. The article doesn’t mention face recognition, but if the technology comes into broader use, such laws will take on much more importance.

The FTC recommends that companies using face recognition “design their services with privacy in mind.” Specifically:

  1. Provide good security for images and other biometric data, including measures to prevent unauthorized scraping of images for uses not authorized by the subject.
  2. Establish “appropriate retention and disposal practices” for consumer images, such as disposing of data that is no longer needed.
  3. “Consider the sensitivity of information” when developing products—for example not putting cameras in sensitive areas such as bathrooms or places where children congregate.
  4. Provide consumers with simplified and transparent choices. Companies deploying signs that utilize cameras (as in Minority Report), for example, “should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs.” Social networks should provide clear notice and meaningful control including the ability to opt out and delete data.
  5. Obtain a consumer’s “affirmative express consent” before a) using his or her image “in a materially different manner than they represented when they collected the data,” or b) using it to “identify anonymous images of a consumer to someone who could not otherwise identify him or her,” such as a stranger in a bar.

All good guidelines, though of course, as the report itself notes, none of them is binding unless a company promises to do one thing and then does another (at which point the agency’s authority to police “unfair or deceptive trade practices” kicks in).

More fundamentally, the FTC’s guidelines do not reach the heart of the issue: whether the de facto anonymity that most people have enjoyed in urban areas since the dawn of the industrial era will come to an end as face recognition technology and ever-more-pervasive video surveillance cameras routinely record everyone’s comings and goings.
