Letter from ACLU of Rhode Island to T.F. Green Airport in Providence

Document Date: October 29, 2001

ACLU Says Plans to Install Facial Recognition Technology in Rhode Island Airport Will Not Improve Safety

FOR IMMEDIATE RELEASE
Monday, October 29, 2001

PROVIDENCE, R.I.–In a letter sent today to the Chair of the Rhode Island Airport Corporation, the American Civil Liberties Union called on officials to reconsider their decision to install facial recognition technology at T.F. Green Airport here.

“Federal government studies of this technology have shown that it does not work,” said Steven Brown, Executive Director of the ACLU of Rhode Island. “The only people who stand to benefit from the installation of facial recognition technology are the companies that are selling it.”

In the letter, Brown said that the technology “will do little else than create a false sense of security while severely eroding fundamental privacy rights and likely increasing the harassment of innocent persons based solely on their ethnic appearance.”

The four-page letter points out that the effectiveness of the technology falls far below the manufacturers’ highly inflated claims. The letter cites government reports showing that the technology creates numerous “false positives” (wrongly matching people with photos of others) and “false negatives” (failing to catch people in the database because the technology can be fooled by such things as changes in hair style, lighting, camera angles, and aging). The letter notes that the Immigration and Naturalization Service abandoned experiments with the technology because of its inaccuracy.

The ACLU letter also raised concerns that, although purportedly being implemented to catch terrorists, the technology would inevitably be expanded to monitor and target a growing number of other types of people.

The right to privacy, Brown said, “must be balanced against the government and the public’s legitimate interest in safety when privacy-intrusive measures can significantly promote security. But one need not even get to that difficult balancing process in this case, for there is simply no objective basis to believe that implementation of FERET at T.F. Green Airport will enhance the security of the air-traveling public in any meaningful way.”

A copy of the ACLU’s letter follows.

October 29, 2001
E. Colby Cameron
Chairperson
R.I. Airport Corporation
T.F. Green Airport
2000 Post Road
Warwick, RI 02886

Dear Mr. Cameron:

I am writing in response to news reports this weekend indicating that RIAC has decided to purchase and use face recognition technology (FERET)* at T.F. Green Airport as an added security measure in light of the tragic events of September 11th. While the ACLU certainly understands RIAC’s interest in increasing security at the airport, we are extremely dismayed and concerned about the substance of this particular decision, as well as the seeming haste with which it has been made.

Since September 11th, both state and airport officials have instituted numerous new security measures at T.F. Green with which the ACLU has had no occasion to disagree. However, because facial recognition is such a highly privacy-invasive technology, we believe its efficacy needs to be considered extremely carefully before it is deployed. Because its effectiveness is open to serious question and its use will affect many innocent travelers, we do not believe the state should be implementing such intrusive technology.

First, of course, face recognition schemes are of no use unless there is a properly established database of suspects. It is our understanding that no photographic database of terrorists presently exists. While we recognize that the FBI and other federal agencies may be working on such a database, it is premature for RIAC to be considering purchase and implementation of this technology before that has even been accomplished.

As for FERET’s effectiveness, studies by the government’s National Institute of Standards and Technology (NIST) and by the Department of Defense strongly suggest that these systems, even when tested in far more ideal conditions than exist at a bustling airport, would miss a high proportion of suspects included in the photo database and would flag huge numbers of innocent people. As a consequence, limited manpower and monetary resources would be unnecessarily diverted to a system that will only create a false sense of security.

In fact, several government agencies have abandoned facial-recognition systems after finding that they did not work as advertised. This includes the Immigration and Naturalization Service, which experimented with using FERET to identify people in cars at the Mexico-U.S. border. If the INS has rejected FERET’s use at our borders, where we arguably most need effective security, it makes even less sense to install it at an airport like T.F. Green.

The DoD study on FERET technology found major “false positive” problems, in which the system reports a match when none exists. Police relying on this technology will therefore be led too often to stop, question and detain innocent people instead of suspects. And if the photo database consists largely, if not exclusively, of Middle Eastern people flagged as terrorists, the result of these numerous “false positives” will fall most heavily on innocent people of Arabic descent and lead to yet another level of racial profiling in law enforcement.

On the flip side, the NIST study also found that digital comparisons of posed photos of the same person taken 18 months apart produced “false negatives” 43 percent of the time. In other words, persons who should have been identified were not. Further, both of these studies were performed with images captured under circumstances far more ideal than a crowded airport. Independent experts agree that, as the NIST study demonstrated, FERET has trouble recognizing the effects of aging, and that changing a hair or beard style or wearing glasses can also fool the computers. In addition, differences in lighting and camera angles, as well as the “uncooperative” nature of the person being photographed, are all known to further increase the inaccuracies of this technology.

It is thus quite shocking to see the chief of the Airport Police Department quoted in the Providence Journal as claiming that these systems are at least 85 percent accurate. If RIAC were relying on statistics like those in making its decision to purchase the technology, it was greatly mistaken. Figures like these come only from the hucksterism of companies seeking to profit from the sale of these systems, not from the technology’s real-life use.

Further, despite the similarly inflated claims of some FERET manufacturers that the technology could have prevented the September 11th attacks, it appears that only two of the hijacking terrorists were on the federal government’s “watch list.” Thus, the vast majority of them would have easily evaded review even with a 100 percent accurate system. As one critic of the technology has noted, people who know how to foil the system will be good at doing so, while it is the innocent who will be pulled into the dragnet because some of their features match those of someone else in the database.

The newspaper reports describing RIAC’s decision to implement FERET as quickly as a month from now also sound warning bells, both about RIAC’s lack of consideration of basic questions regarding the system’s use and about the inevitably slippery slope of this technology. On the latter point, Mr. Cheston, RIAC’s executive director, is quoted in the Journal as saying: “We’re looking for the 25 top terrorists.” But he then proceeded to add, “We potentially may be looking for people with outstanding warrants from local jurisdictions. We potentially may be looking for major drug kingpins. That’s a no-brainer.” Thus, even before RIAC’s decision to use the technology for the purported goal of preventing terrorism was old news, a trial balloon had been floated about putting the technology to uses much broader than catching terrorists. Of course, if using the system to catch drug kingpins is a “no-brainer,” then why not add to the database the photograph of any person who has a criminal record or is suspected of a crime?

In the news article, Mr. Cheston apparently draws the line at using FERET to find people with speeding tickets. However, if he is interested in looking for “people with outstanding warrants from local jurisdictions,” speeders who have failed to show up in court to pay their tickets would fit comfortably within his proposed criteria. Not coincidentally, only two weeks ago the Registry of Motor Vehicles announced plans to begin digitizing driver’s license photographs, thus making photos of every driver in the state compatible for use with a FERET database. Once the technology is implemented at T.F. Green, the pressure to use it for other law-enforcement purposes will be enormous. In fact, now that the Registry is obligated to collect drivers’ Social Security Numbers in order to ferret out parents lagging in their child support, it will surely not be long before someone suggests using the Registry’s photo database to catch “deadbeat dads” at the airport.

If the expanded use of FERET in these ways seems far-fetched, it is not. Frankly, one would be hard-pressed to think of a privacy-invasive technology instituted in our time whose use has remained limited to its original purpose. Indeed, it was Viisage Technology — the company from which RIAC proposes to purchase FERET — that, with the cooperation of the Tampa, Florida police, surreptitiously took the photos of every person attending the Super Bowl this year and ran them through a large criminal photo database. Nobody was arrested as a result of this secret surveillance experiment, which made every Super Bowl patron part of a giant police line-up. The experiment did, however, allegedly produce 19 “matches.” Many of these appear to have been “false positives,” and the others were of such lawbreakers as pickpockets and ticket scalpers, none of whom were alleged to have done anything illegal during the game.

The hastiness of RIAC’s decision to deploy FERET only heightens those concerns. No consideration appears to have been given in advance to such weighty questions as who will be in the database and who will be making that decision. Also, what assurances are there that all the “no-match” photographs taken by the system will be destroyed? Who will have access to the database? How will the decision be made as to how sensitive to make the machine so as to trigger a “match”? And so on.

Of course, we fully recognize that the right to privacy is far from absolute. That right must be balanced against the government and the public’s legitimate interest in safety when privacy-intrusive measures can significantly promote security. But one need not even get to that difficult balancing process in this case, for there is simply no objective basis to believe that implementation of FERET at T.F. Green Airport will enhance the security of the air-traveling public in any meaningful way. Instead, its use will do little else than create a false sense of security while severely eroding fundamental privacy rights and likely increasing the harassment of innocent persons based solely on their ethnic appearance.

For all these reasons, we strongly urge the Corporation to reconsider its decision and to put on hold any efforts to implement FERET technology at this time.

Thank you in advance for your consideration of our views. As the Corporation Board further considers this matter, we would be happy to share with you additional information about this issue.

Sincerely,

Steven Brown
Executive Director

cc: RIAC Board Members
Michael Cheston

* FERET is the acronym used by the federal government for its program evaluating FacE REcognition Technology. For purposes of economy, I use it in this letter to refer to such technology generally.
