Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots

Amazon’s face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime. 

The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.

Amazon Rekognition False Matches of 28 Members of Congress
Our test used Amazon Rekognition to compare images of members of Congress with a database of mugshots. The results included 28 incorrect matches. 

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

To conduct our test, we used the exact same facial recognition system that Amazon offers to the public, which anyone could use to scan for matches between images of faces. And running the entire test cost us $12.33 — less than a large pizza.


Using Rekognition, we built a face database and search tool using 25,000 publicly available arrest photos. Then we searched that database against public photos of every current member of the House and Senate. We used the default match settings that Amazon sets for Rekognition.
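The pipeline described above maps directly onto Amazon's Rekognition API. The sketch below shows roughly what such a test looks like using the AWS SDK for Python (boto3); the collection and bucket names are illustrative, not the ACLU's actual code. The "default match settings" refers to Rekognition's default similarity threshold of 80 percent for face searches.

```python
# Sketch of the test pipeline: build a face collection from arrest photos,
# then search it with a photo of a member of Congress. Names like "mugshots"
# and "arrest-photos" are hypothetical placeholders.

def match_faces(search_response, threshold=80.0):
    """Keep only face matches at or above the similarity threshold
    (80.0 mirrors Rekognition's default FaceMatchThreshold)."""
    return [
        m["Face"]["ExternalImageId"]
        for m in search_response["FaceMatches"]
        if m["Similarity"] >= threshold
    ]

# The live AWS calls would look roughly like this (requires credentials):
#
# import boto3
# rek = boto3.client("rekognition")
# rek.create_collection(CollectionId="mugshots")
# rek.index_faces(
#     CollectionId="mugshots",
#     Image={"S3Object": {"Bucket": "arrest-photos", "Name": "photo-1234.jpg"}},
#     ExternalImageId="photo-1234",
# )
# resp = rek.search_faces_by_image(
#     CollectionId="mugshots",
#     Image={"S3Object": {"Bucket": "congress-photos", "Name": "member.jpg"}},
# )

# A mocked response in the shape Rekognition returns:
sample_response = {
    "FaceMatches": [
        {"Similarity": 84.2, "Face": {"ExternalImageId": "photo-1234"}},
        {"Similarity": 61.0, "Face": {"ExternalImageId": "photo-5678"}},
    ]
}
print(match_faces(sample_response))  # → ['photo-1234']
```

Note that at the default 80 percent threshold, the lower-similarity candidate is silently dropped while the 84.2 percent candidate is reported as a "match" — there is no flag in the result distinguishing a strong match from one barely over the line.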

Rep. Sanford Bishop (D-Ga.) was falsely identified by Amazon Rekognition as someone who had been arrested for a crime.

In a recent letter to Amazon CEO Jeff Bezos, the Congressional Black Caucus expressed concern about the “profound negative unintended consequences” face surveillance could have for Black people, undocumented immigrants, and protesters. Academic research has also already shown that face recognition is less accurate for darker-skinned faces and women. Our results validate this concern: Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress.

Racial Bias in Amazon Face Recognition
People of color were disproportionately falsely matched in our test.

If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.

An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.

Matching people against arrest photos is not a hypothetical exercise. Amazon is aggressively marketing its face surveillance technology to police, boasting that its service can identify up to 100 faces in a single image, track people in real time through surveillance cameras, and scan footage from body cameras. A sheriff’s department in Oregon has already started using Amazon Rekognition to compare people’s faces against a mugshot database, without any public debate.

Face surveillance also threatens to chill First Amendment-protected activity like engaging in protest or practicing religion, and it can be used to subject immigrants to further abuse from the government.

These dangers are why Amazon employees, shareholders, a coalition of nearly 70 civil rights groups, over 400 members of the academic community, and more than 150,000 members of the public have already spoken up to demand that Amazon stop providing face surveillance to the government.

Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition. This technology shouldn't be used until the harms are fully considered and all necessary steps are taken to prevent it from harming vulnerable communities.

List of Members of Congress Falsely Matched With Arrest Photos

Senate

  • John Isakson (R-Georgia)
  • Edward Markey (D-Massachusetts)
  • Pat Roberts (R-Kansas)

House

  • Sanford Bishop (D-Georgia)
  • George Butterfield (D-North Carolina)
  • Lacy Clay (D-Missouri)
  • Mark DeSaulnier (D-California)
  • Adriano Espaillat (D-New York)
  • Ruben Gallego (D-Arizona)
  • Thomas Garrett (R-Virginia)
  • Greg Gianforte (R-Montana)
  • Jimmy Gomez (D-California)
  • Raúl Grijalva (D-Arizona)
  • Luis Gutiérrez (D-Illinois)
  • Steve Knight (R-California)
  • Leonard Lance (R-New Jersey)
  • John Lewis (D-Georgia)
  • Frank LoBiondo (R-New Jersey)
  • David Loebsack (D-Iowa)
  • David McKinley (R-West Virginia)
  • John Moolenaar (R-Michigan)
  • Tom Reed (R-New York)
  • Bobby Rush (D-Illinois)
  • Norma Torres (D-California)
  • Marc Veasey (D-Texas)
  • Brad Wenstrup (R-Ohio)
  • Steve Womack (R-Arkansas)
  • Lee Zeldin (R-New York)

Zachary

As someone in law enforcement who oversees a DNA & Biometrics Unit in my department in CA, I have made numerous presentations on the use of DNA and biometrics for law enforcement. My expertise in this field is extensive. First, facial recognition is highly unreliable. It is why 95% of the agencies in CA do not use it. Facial recognition can lead to "no hits" simply because someone is wearing contact lenses. The best agencies can do is state they have "candidates" for a potential hit. FR was designed to be used after an event: if there is crime-scene video or photos, those can be compared against an agency's mugshot system for potential hits. What is more notable, FR, along with iris and retina scanning, is being pushed more by the vendors than by the LE agencies: MorphoTrust, Safran, Oberthur (all now Idemia), Cogent/Gemalto (now owned by Thales), and many others.

The issue with Amazon and some other tech companies is that they are pushing this software and these services, along with their own cloud offerings, with no regard to federal, NIST, and state legislative requirements for criminal information. The sheriff in Oregon comes to mind, who announced that 400,000 photos went to Amazon's Rekognition system. Did he ensure those photos/records did not include juvenile records, sealed records, or purged arrests? That would be the first thing I'd review if I were the ACLU.

What continues to fascinate me is the lack of focus on what is really happening. The companies I mentioned above should also include Identix, which merged with L-1 Solutions, was bought by MorphoTrak, and merged with MorphoTrust (which bought GE's biometric division, among others). Morpho was acquired by Safran (owned by a French hedge fund and a French bank), which then merged with Oberthur to become Idemia. Cogent was sold to 3M, sold off to Gemalto (maker of mobile SIM cards), and Gemalto was just acquired by Thales.

My point is, 80% of all biometric systems (fingerprints, facial, retina, iris), including many US state DMV systems, are now owned by two French multinational corporations, which note in their stock prospectus literature that they plan to upload all the data into their corporate clouds with no regard to US federal or state legislative mandates covering such information under CORI guidelines, let alone privacy laws. Ha, and we are worried about terrorists.

Another aspect is that social media has opened the door to law enforcement successfully accessing devices (cell phones) without a warrant, even after the US Supreme Court ruled warrants are required. In one case, the judge allowed the warrantless access when the prosecution successfully argued that the individual had over 60 apps on the phone, with full permissions open to those apps, so there was no expectation of privacy. As FR and fingerprints become the norm for unlocking phones, the privacy aspects get blurred. Anyway, there is my two cents' worth.

Anonymous

Falsely-Matched or swept under the rug?

Anonymous

Only fools assume a computer is 100% accurate. It's a tool for finding *possible* matches. Once you find a possible match, a human should take the next step.

Humans also mistakenly identify people. The prisons are full of people who were misidentified by witnesses of another race, a kind of cross-racial identification that studies show is especially error-prone. You're fighting the wrong battle.

William Brennan

This technology should be used only as a first step in identification to allow for rapid searching. A human should be required to personally compare the images that are supposed to match before any action can be taken on the basis of it. That would resolve most of the issues raised here.
