Biometrics Industry: Anonymity is Forfeit

The Department of Commerce has convened a “multistakeholder process” between civil society groups (like the ACLU) and industry groups, with the aim of limiting face recognition as a tool of surveillance in our society by establishing common ground and creating agreement on core principles that would allow face recognition to be used in a controlled and responsible way.

The big question in such a process is: what does industry think? If you’re building powerful new surveillance technologies, what rules of the road will you accept to guide their operation and use?

Now we’ve got the beginnings of an answer to that question in the form of a whitepaper from the International Biometrics & Identification Association (IBIA), an industry group that represents the vendors of powerful new tracking technologies such as face recognition and iris scanning. It’s the clearest indication I’ve ever seen of how the biometrics industry would like surveillance to expand.

We’re off to a rocky start. In its Privacy Best Practices Recommendations for Commercial Biometric Use, the IBIA essentially denies that there are any real privacy problems with biometrics and face recognition, beyond the basic needs to be transparent and to keep information secure.

The document really speaks for itself, so I’m just going to offer some quotes (in italics, followed by my own brief editorial notes).

  • Anonymity and privacy are not synonymous terms. The former is forfeited if one chooses to live in society. (Pg. 5) This would be a surprise to many of our Founding Fathers, who anonymously published the Federalist Papers, one of the foundational documents of our new constitutional government. Also, isn’t being largely unknown exactly what we expect when we venture outside? Forfeiting that means giving up an awful lot.
  • In both one‐to‐one verification and one‐to‐many identification applications biometrics merely provides an identity result for the questions “are you who you claim to be?” or “who are you?” These results do not necessarily diminish privacy or profile a person. (Pg. 5) It’s hard to imagine how identifying a person who was previously anonymous does anything but diminish privacy—a literally faceless watcher now knows who I am. That can be used to invade my privacy in many ways, and of course also may become the basis of a profile.
  • The facial template itself, like other biometric templates, provides no personal information. (Pg. 5) This is disingenuous, because the facial template is of course the key to identifying an individual. It is like saying that a Social Security number contains no personal information.
  • Surveillance is already a part of our daily life, thanks to the digital age and tremendous increases in computational power. Facial recognition does not increase its use. (Pg. 8) Arguing that a powerful new surveillance technology doesn’t increase surveillance is silly. But the broader argument is very dangerous: that we just need to accept that we are already constantly surveilled and get over it. The technical tools to create a surveillance society already exist. If we accept that they are the new normal, we subvert the only things that can prevent mass spying: our fundamental values and norms.
  • Under either class of common security surveillance video technology, it isn’t practical or possible to conceive of a “face stalking” application that can be accessed and run across all the video cameras in a surveillance system. Stalking, although thankfully infrequent, occurred before the advent of facial recognition technology, and unfortunately will continue to occur, whether facial recognition becomes a factor or not. (Pg. 9) This ignores the fact that a stalker could use their own camera to take a picture of a potential victim and then use face recognition technology to identify him or her. Think about what you could do with an unregulated Google Glass.
  • I will end with a quote that I don’t even understand: As we face new social, political, security, and economic challenges in the 21st century, it is fitting that identity assurance, the underpinnings of individual and collective security, benefit from biometric identity technologies that reflect the uniqueness of the men, women and children living in societies we strive to create and improve upon. (Pg. 6) Somehow I don’t think this is the value of uniqueness that most Americans cherish.

This document represents just one industry view, but it’s a powerful one, because these are the people who are currently making and marketing these technologies. Other industry stakeholders, like Google and Facebook, have taken different, more responsible positions. The question for them is whether they want this to be the “industry position.” The question for the rest of us is: if this is how the watchers want to treat us, what are we going to do about it?


