Opposition to Face Recognition Software in Airports Due to Ineffectiveness and Privacy Concerns

Document Date: July 26, 2002

The terrorist attacks of September 11 have led airports and other institutions to look for new ways of improving security, many of which the American Civil Liberties Union supports.

However, we believe that the use of intrusive new surveillance technologies must be subjected to a process of thoughtful and deliberate scrutiny. In particular, a technology’s intrusiveness must be balanced against the security benefits it would bring. The burden is on the technologists to demonstrate that their solutions will actually be effective in making us safer.

One solution that is under consideration at a number of airports is to conduct widespread video surveillance of airport patrons and to use “face recognition” technology – dubbed “Ferret” by the Department of Defense – in an effort to check the identity of passengers. But it is abundantly clear that the security benefits of such an approach would be minimal to non-existent, for a very simple reason: the technology doesn’t work.

Several government agencies have abandoned facial-recognition systems after finding they did not work as advertised, including the Immigration and Naturalization Service, which experimented with using the technology to identify people in cars at the Mexico-US border.

Nonetheless, officials at Logan Airport in Boston and T.F. Green Airport in Providence, Rhode Island, have announced that they will be installing the technology. The ACLU has urged officials at these airports to reconsider their plans.

Anyone who claims that facial recognition technology is an effective law enforcement tool is probably working for one of the companies trying to sell it to the government. Facial recognition software is easily tripped up by changes in hairstyle or facial hair, by aging, by weight gain or loss, and by simple disguises. A study by the Department of Defense found very high error rates even under ideal conditions, with the subject staring directly into the camera under bright lights. The study found very high rates of both “false positives” (wrongly matching people with photos of others) and “false negatives” (failing to catch people who are in the database). This suggests that if these systems were installed in airports, they would miss a high proportion of the suspects included in the photo database while flagging huge numbers of innocent people – thereby lessening vigilance, wasting precious manpower, and creating a false sense of security.
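The problem of false alarms swamping genuine matches is a matter of simple arithmetic. A minimal sketch, using hypothetical numbers chosen for illustration (the actual error rates from the Department of Defense study are not reproduced here, and real suspects are far rarer than in this example):

```python
# Illustrative base-rate arithmetic with hypothetical numbers:
# even a matcher with a seemingly low false-positive rate generates
# far more false alarms than genuine hits when suspects are rare.

def screening_outcomes(passengers, suspects, false_positive_rate, false_negative_rate):
    """Return (true hits, missed suspects, innocent people flagged)."""
    hits = suspects * (1 - false_negative_rate)          # suspects correctly flagged
    misses = suspects * false_negative_rate              # suspects who slip through
    false_alarms = (passengers - suspects) * false_positive_rate  # innocents flagged
    return hits, misses, false_alarms

# Assume 100,000 passengers per day, 10 of them in the photo database,
# with an optimistic 1% false-positive and 30% false-negative rate.
hits, misses, false_alarms = screening_outcomes(100_000, 10, 0.01, 0.30)
print(hits)          # 7.0  -> seven suspects correctly flagged
print(misses)        # 3.0  -> three suspects missed
print(false_alarms)  # 999.9 -> roughly a thousand innocent passengers flagged daily
```

Under these assumptions, security staff would have to investigate about a thousand innocent travelers for every seven genuine matches, while still letting three of ten suspects through.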

Facial recognition technology carries the danger that its use will evolve into a widespread tool for spying on American citizens as they move about in public places. If the technology promised a significant increase in protection against terrorism, it would be important to evaluate its dangers and benefits in depth. But that conversation is beside the point when face recognition has been shown to be so unreliable as to be useless for important security applications.

Face-recognition at the airport offers us neither order nor liberty.