ACLU Opposes Use of Face Recognition Software in Airports, Citing Ineffectiveness and Privacy Concerns

October 5, 2001 12:00 am

Media Contact
125 Broad Street
18th Floor
New York, NY 10004
United States


NEW YORK — The American Civil Liberties Union today said it opposes plans to install facial recognition software in combination with video surveillance in U.S. airports because flaws in the technology make it ineffective and provide a false sense of security — a conclusion several federal agencies have already reached.

The ACLU also cited privacy concerns, but said that before legitimate questions about privacy invasions could even be considered, the effectiveness of the software must be carefully examined.

“Every new proposal for using a potentially intrusive technology should be subjected to a two-step analysis,” said Barry Steinhardt, Associate Director of the ACLU. “The first step is to evaluate whether a technology actually makes us safer. If it does, the second step would be to balance the actual increase in safety against the technology’s cost to our fundamental freedoms, including our privacy.”

“In the case of facial recognition technologies,” Steinhardt said, “it is abundantly clear that the current technology fails the first test, so there is really no need to get into deep analyses of the conflict between freedom and security posed by these systems. Because of its high failure rate, the technology cannot significantly increase our safety.”

Despite media reports that one airport has decided to install the facial recognition software, neither the government nor the technology’s manufacturer has disclosed the location of the airport where the technology will be tested.

The specific proposed use of facial recognition involves using computers to scan an image — like those from security cameras at an airport — and then searching through a database of other pictures for matches. A number of the companies selling face-recognition products have made elaborate promises about their technology since the Sept. 11 attacks — claims that in some cases have contributed to a sharp run-up in their stocks, according to The Wall Street Journal.

“Anyone who claims that facial recognition technology is an effective law enforcement tool,” Steinhardt said, “is probably working for one of the companies trying to sell it to the government.”

In fact, several government agencies have already tried and rejected as impractical similar uses of facial recognition software, including the Immigration and Naturalization Service, which tried a system to identify people in cars at the Mexico border.

“If the INS has rejected face scanning technology at its borders, which is arguably one of the places where we most need effective security, then it makes even less sense to start installing it in our nation’s airports,” Steinhardt said.

A study last year by the Department of Defense found major problems with “false positives,” in which the system reports a match when in fact there is none. Police relying on this technology would therefore too often be led to stop and question innocent people instead of suspects.

Another recent study by the National Institute of Standards and Technology found that digital comparisons of posed photos of the same person taken 18 months apart triggered “false negatives” by computers 43 percent of the time. In other words, people who should have been identified were not.

“These studies were performed with images captured under circumstances far more ideal than a crowded, bustling airport,” Steinhardt said. “Installing this software in airports is only going to give people a false sense of security that we are able to catch potential terrorists when in fact the technology is simply not up to it.”

Experts in such systems say they can be relatively easy to fool simply by changing hair or beard style or by wearing glasses. Computers also have trouble accounting for the effects of aging; the software often fails to recognize a person if more than a year or two has passed since the original database picture was taken.

Earlier this year, the ACLU expressed concerns after Florida officials deployed face-scanning technology at the Super Bowl and in the City of Tampa. The ACLU of Florida has called for public hearings on the use of security systems in Tampa and elsewhere.

Officials there have not yet responded to the ACLU’s demand that they demonstrate to the public the law enforcement value of the system, which at the Super Bowl reportedly produced many “false positives,” and explain how the captured images will be used, stored, and disposed of.
