Letter to Arizona School Officials on School Face Recognition

December 17, 2003

Tom Horne
Arizona Superintendent of Public Instruction
1535 W. Jefferson
Phoenix, AZ  85007


Thomas F. Reale, Jr.
Superintendent
Washington Elementary School District
8610 N. 19th Avenue, Phoenix, AZ  85021

Kate McGee
President, Governing Board
Washington Elementary School District
8610 N. 19th Avenue, Phoenix, AZ  85021

Joe Arpaio
Maricopa County Sheriff
100 W. Washington, Suite 1900
Phoenix, AZ 85003

cc: Nedda Shafir, Director of Community Services
      Washington Elementary School District
      8610 N. 19th Avenue, Phoenix, AZ  85021

 

Dear Mr. Horne, Mr. Reale, Mrs. McGee, and Mr. Arpaio,

We write to urge you to reconsider the decision to install a facial recognition video system at Royal Palm Middle School.  We are extremely dismayed and concerned about this decision and the seeming lack of consideration with which it was made.

There are several reasons why we think this is a bad decision and a bad use of public funds. 

1. The technology does not work

First, the success rate of face recognition technology has been dismal, and among knowledgeable biometrics and security experts, its deployment in public places has largely been discredited.  In test after test, both in the lab and in the real world, this technology has been found lacking.  The many independent findings to that effect include a trial conducted by the U.S. military in 2002, which found that with a false-positive rate of around 1%, the technology had less than a 10% chance of successfully identifying a person in its database who appeared before the camera.  To achieve an accuracy rate that would correctly flag 50% of any "bad guys" who appear before the camera, the test found the system would have to accept a false-positive rate of approximately 70%.[1]

Sheriff Arpaio has stated that when the system identifies a match, police will come to the school to investigate.  Note that even under the unrealistically optimistic assumption of a false-positive rate of just 0.1%, or one in a thousand, the police will be called to the school many times a month because some parent, student, or visitor has set off the alarm.
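The arithmetic behind that estimate can be sketched as follows; the daily traffic figure of 500 scans is our illustrative assumption, not a number supplied by the school district or the Sheriff's office:

```python
# Back-of-the-envelope estimate of false alarms from a face-scanning
# checkpoint.  The scan volume below is an illustrative assumption.

def expected_false_alarms(scans_per_day, false_positive_rate, days):
    """Expected number of innocent people flagged over a period."""
    return scans_per_day * false_positive_rate * days

# Assume roughly 500 face scans per school day (students, parents,
# staff, visitors) and 20 school days per month, at an optimistic
# 0.1% false-positive rate.
alarms = expected_false_alarms(500, 0.001, 20)
print(alarms)  # 10.0 -- about ten false alarms every month
```

Even this charitable scenario yields roughly ten police call-outs a month, every one of them triggered by an innocent person; at the false-positive rates actually measured in field trials, the figure would be far higher.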

Even leaders of biometrics companies now acknowledge that the technology is ineffective when deployed in public places.  As the Boston Globe reported in September, 

The ACLU isn't alone in its doubts about the use of face scanning at airports. Viisage [a face-recognition software developer] president and CEO Bernard Bailey. . . said that he, too, is opposed to the idea, simply because the technology still isn't good enough.

"I don't think that's the best use of our technology," Bailey said. "The hype of this technology got way ahead of the capabilities of it."

Bailey said that the accuracy of airport facial scanning is hampered by the fact that scanning devices still have trouble coping with different lighting conditions and poses.  Even changing the position of a person's head can cause an inaccurate identification.[2]

2. The implications for privacy and freedom have not been sufficiently considered

This is a technology with deep privacy implications that at a minimum should be the subject of a thorough and wide-ranging debate before any use.  If its deployment became widespread, as some supporters are calling for, it would create a network of checkpoints where the government scans the faces of Americans as they go about their business - without their permission, and quite probably without their knowledge - and runs their images through various databases of wrongdoers.  Such a network could potentially extend to the thousands of video cameras that are already blanketing our public spaces to a greater and greater degree.  That would mean a frightening change to life in America, where we have always prided ourselves on our freedom to go about our lives without being tracked by the government unless we are suspected of wrongdoing.  The fact that the technology works so poorly makes it more frightening, not less, because it means that many innocent people would be targeted for doing nothing more than showing their face in public.  We believe that Americans do not want to travel further down the road toward a total surveillance society.

3. Schools are not a safe place to set up police checkpoints

Finally, even if we were to pretend that it is a good idea to set up face-recognition checkpoints in America, and if we were to pretend that the technology worked reliably, the question must be asked:  do we really want our schools to be the place where police try to apprehend "bad guys," some of whom may resist arrest in desperate and violent ways?  We do not want shootouts in our schools.  And we don't want to poison the atmosphere of our schools with a highly invasive and erratic technology; innocent parents and visitors are not likely to take kindly to being stopped and interrogated by police officers about their identity and background based on the erratic output of a computer program or a passing resemblance to a sex offender from some faraway state.

In short, the deployment of face recognition software in our schools is a bad idea.  It is an assault on the privacy of your students, staff, and visitors and will not protect our children.  It indicates a profound lack of familiarity with the technology and its limitations.  And at a time when the state of Arizona is facing an enormous budgetary shortfall, it would drain precious educational and law enforcement resources away from more pressing needs.  We urge you not to use face recognition at Royal Palm Middle School.

Sincerely,

Eleanor Eisenberg
Executive Director
Arizona Civil Liberties Union

Barry Steinhardt
Director, Technology and Liberty Program
American Civil Liberties Union



 

[1] See /cpredirect/14881; see the slide entitled "Phase 2 Results: False Alarm and Correct Alarm Rates."

[2] Shelley Murphy and Hiawatha Bray, "Face recognition devices failed in test at Logan," Boston Globe, Sept. 3, 2003.  See also Jonathan Sidener, "Biometrics' once-promising future as a terrorist detector fades after setbacks," San Diego Union-Tribune, Sept. 29, 2003:

Supporters of biometrics draw a distinction between face-recognition surveillance of a moving crowd and use of the technology with a single, stationary subject in a setting where the lighting is controlled. 

 "It's a problem if you use it in a part of the airport where people are on the move and there's different lighting in different areas," said Bill Willis, senior vice president for technology at ImageWare Systems, a San Diego imaging and biometrics company.
