ACLU Calls on Fresno Airport to Remove Controversial Facial Recognition Technology

November 20, 2001 12:00 am

Media Contact
125 Broad Street
18th Floor
New York, NY 10004
United States

FOR IMMEDIATE RELEASE

SAN FRANCISCO–A controversial facial recognition system installed at Fresno’s Yosemite Airport will do little to keep Americans safe but threatens fundamental civil rights, the American Civil Liberties Union said today in a letter to city officials.

“The ACLU fully recognizes the need for improving airport security in the wake of the September 11 attacks,” said Jayashri Srikantiah, a staff attorney with the ACLU of Northern California. “However, we also believe that the use of intrusive new surveillance technologies must be subjected to a process of thoughtful and deliberate scrutiny. The effectiveness of facial recognition technology is open to serious question and we believe that it will not enhance the security of the air-traveling public in any meaningful way.”

In a letter sent today to the Director of Transportation for the City of Fresno, the ACLU urged the City to “reconsider its decision to employ facial recognition on a trial basis, and to put on hold any efforts to implement the technology on a permanent basis.” The ACLU goes on to say that the use of the technology “will create a false sense of security while severely eroding fundamental privacy rights and likely increasing the harassment of innocent people based solely on their ethnic appearance.”

Several government agencies have abandoned facial-recognition systems after finding they did not work as advertised, including the Immigration and Naturalization Service, which experimented with using the technology to identify people in cars at the Mexico-U.S. border.

A study by the Department of Defense found very high error rates even under ideal conditions where the subject is staring directly into the camera under bright lights. If installed in airports, the technology would miss a high proportion of suspects included in the photo database, and flag huge numbers of innocent people — thereby lessening vigilance, wasting manpower resources, and creating a false sense of security.

Nonetheless, officials at Logan Airport in Boston and T.F. Green airport in Providence, Rhode Island, have announced that they will be installing the technology. The ACLU has urged officials in these airports to reconsider their plans.

More information on facial recognition technology can be found at the ACLU’s special online feature at http://archive.aclu.org/features/f110101a.html.

The letter follows:

November 20, 2001

Charles R. Hayes
Director of Transportation
Airport Administration
City of Fresno
4995 East Clinton Way
Fresno, CA 93727

Dear Mr. Hayes:

According to recent news reports and our conversations with Public Relations and Communications Director Patti Miller, Fresno Yosemite International Airport has installed a facial recognition system on a trial basis as an added security measure in light of the tragic events of September 11th. While the ACLU certainly understands Fresno Yosemite’s interest in increasing security at the airport, we are extremely dismayed and concerned about its decision to install a facial recognition system. Because facial recognition is a highly privacy-invasive technology, we believe its efficacy needs to be considered extremely carefully before it is deployed, whether on a trial or permanent basis. We believe that the effectiveness of facial recognition technology is open to serious question, and therefore, that airports should not be implementing the technology.

To begin with, facial recognition schemes are of little use without a photographic database of suspects. It is our understanding that the “terrorist database” Fresno Yosemite will be using is both rudimentary and tiny. While we recognize that the FBI and other federal agencies may be working on the database, it is unrealistic to think that it will ever contain the photographs of more than a small fraction of potential international terrorists. The database also could not include more than a tiny fraction of potential domestic terrorists – many of whom, like Timothy McVeigh, have no criminal records. It makes little sense to employ an intrusive system that will have little chance of success. The technology will not only divert resources from more effective efforts, but it will also create a false sense of security that will cause us to let our guard down.

In addition, studies by the National Institute of Standards and Technology (NIST) and the Department of Defense strongly suggest that facial recognition systems, even when tested under conditions far more favorable than those of a bustling airport, would miss a high proportion of suspects included in the photo database, while flagging many innocent people. The Department of Defense study, for instance, found major “false positive” problems, in which the system reported a match when none existed. Police and airport authorities relying on facial recognition systems will therefore be led too often to stop, question, and detain innocent people instead of suspects. If the photo database consists largely, if not exclusively, of Middle Eastern people flagged as terrorists, the result of these numerous “false positives” will fall most heavily on innocent people of Arabic or South Asian descent and lead to yet another pattern of racial profiling in law enforcement.

On the flip side, the NIST study found a “false negative” rate – the rate at which the technology failed to identify persons who should have been identified – of 43 percent when the technology was asked to compare current images with photographs of the same subjects taken just 18 months before. Independent experts agree, as the NIST study demonstrated, that facial recognition systems have trouble accounting for the effects of aging, and that changing hair or beard style or wearing glasses can fool the systems. Differences in lighting and camera angles, as well as the fact that individuals are not posing for photos (but are instead being photographed surreptitiously, as at Fresno Yosemite), are all known to further increase the inaccuracies of facial recognition systems.

In fact, several government agencies have abandoned facial recognition systems after finding that they did not work as advertised. For instance, the Immigration and Naturalization Service experimented with facial recognition technology to identify people in cars at the United States-Mexico border, but ultimately decided against using the technology.

We are concerned as well that there will be enormous pressure to use facial recognition to look for people suspected of non-terrorist activity, such as those with outstanding warrants from local jurisdictions, or even motorists with outstanding speeding tickets. If such an expanded use of facial recognition technology seems far-fetched, it is not. We are hard-pressed to think of a privacy-invasive technology instituted in our time which has been limited to its original use. Indeed, facial recognition technology was used to surreptitiously take the photos of every person attending the Super Bowl this year. Nobody was arrested as a result of this secret surveillance experiment that made every Super Bowl patron part of a giant police line-up. The “matches” found by the system appear to have been either “false positives” or of minor lawbreakers, none of whom were alleged to have done anything illegal during the game. Despite these serious problems with the technology, it was subsequently installed on Tampa’s public streets, where its use is even less justifiable than at the Super Bowl.

The hastiness of Fresno Yosemite’s decision to deploy facial recognition technology exacerbates our concerns about the use of the technology. According to our conversations with Ms. Miller, the airport did not undergo serious, formal deliberation before deciding to install facial recognition technology. Rather, the decision was made in an informal manner, and without public participation or debate. It appears that the vendor of the technology (who is providing it free of charge to Fresno Yosemite) will be deciding weighty questions such as who will have access to the database and what level of “match” should trigger an alarm.

We fully recognize that the right to privacy at airports is not absolute. The right must be balanced against the government and the public’s legitimate interest in safety when privacy-intrusive measures significantly promote security. But, we need not even reach that difficult balancing in this case, for there is simply no objective basis to believe that implementation of facial recognition technology at Fresno Yosemite will enhance the security of the air-traveling public in any meaningful way. Instead, its use will create a false sense of security while severely eroding fundamental privacy rights and likely increasing the harassment of innocent people based solely on their ethnic appearance.

For all of the reasons in this letter, we strongly urge Fresno Yosemite International Airport to reconsider its decision to employ facial recognition technology on a trial basis, and to put on hold any efforts to implement the technology on a permanent basis.

Thank you for your consideration of our views. Should you have additional questions or concerns, we would welcome the opportunity to meet with you and other appropriate officials so that we can present our views on facial recognition technology in more detail.

Sincerely,

Jayashri Srikantiah
ACLU of Northern California

Barry Steinhardt
American Civil Liberties Union
