Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots

Amazon’s face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime. 

The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.

Amazon Rekognition false matches of 28 members of Congress
Our test used Amazon Rekognition to compare images of members of Congress with a database of mugshots. The results included 28 incorrect matches. 

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.

To conduct our test, we used the exact same facial recognition system that Amazon offers to the public, which anyone could use to scan for matches between images of faces. And running the entire test cost us $12.33 — less than a large pizza.


Using Rekognition, we built a face database and search tool using 25,000 publicly available arrest photos. Then we searched that database against public photos of every current member of the House and Senate. We used the default match settings that Amazon sets for Rekognition.
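For readers curious what such a test looks like in practice, the following is a minimal sketch using the public boto3 Rekognition API. It is not the ACLU's actual code: the bucket name, object keys, and collection ID are placeholders, and the search simply leaves the API's match threshold at its default.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
COLLECTION_ID = "mugshot-test"  # placeholder collection name

# 1. Create a face collection and index each publicly available arrest photo.
rekognition.create_collection(CollectionId=COLLECTION_ID)
for key in ["mugshots/000001.jpg", "mugshots/000002.jpg"]:  # ...up to 25,000 images
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": "example-bucket", "Name": key}},
        ExternalImageId=key.split("/")[-1],
    )

# 2. Search the collection with a public photo of a member of Congress,
#    leaving FaceMatchThreshold at the API default (80 percent similarity).
response = rekognition.search_faces_by_image(
    CollectionId=COLLECTION_ID,
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "congress/member.jpg"}},
)
for match in response.get("FaceMatches", []):
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```

Any face in the collection scoring above the threshold comes back as a "match," which is why the choice of threshold and the composition of the collection both matter so much in practice.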

The Rekognition scan: comparing input images to a mugshot database
Rep. Sanford Bishop (D-Ga.) was falsely identified by Amazon Rekognition as someone who had been arrested for a crime. 

In a recent letter to Amazon CEO Jeff Bezos, the Congressional Black Caucus expressed concern about the “profound negative unintended consequences” face surveillance could have for Black people, undocumented immigrants, and protesters. Academic research has also shown that face recognition is less accurate for darker-skinned faces and for women. Our results validate this concern: Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress.

Racial Bias in Amazon Face Recognition
People of color were disproportionately falsely matched in our test.

If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.

An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.

Matching people against arrest photos is not a hypothetical exercise. Amazon is aggressively marketing its face surveillance technology to police, boasting that its service can identify up to 100 faces in a single image, track people in real time through surveillance cameras, and scan footage from body cameras. A sheriff’s department in Oregon has already started using Amazon Rekognition to compare people’s faces against a mugshot database, without any public debate.

Face surveillance also threatens to chill First Amendment-protected activity like engaging in protest or practicing religion, and it can be used to subject immigrants to further abuse from the government.

These dangers are why Amazon employees, shareholders, a coalition of nearly 70 civil rights groups, over 400 members of the academic community, and more than 150,000 members of the public have already spoken up to demand that Amazon stop providing face surveillance to the government.

Congress must take these threats seriously, hit the brakes, and enact a moratorium on law enforcement use of face recognition. This technology shouldn’t be used until the harms are fully considered and all necessary steps are taken to prevent them from harming vulnerable communities.

List of Members of Congress Falsely Matched With Arrest Photos

Senate

  • John Isakson (R-Georgia)
  • Edward Markey (D-Massachusetts)
  • Pat Roberts (R-Kansas)

House

  • Sanford Bishop (D-Georgia)
  • George Butterfield (D-North Carolina)
  • Lacy Clay (D-Missouri)
  • Mark DeSaulnier (D-California)
  • Adriano Espaillat (D-New York)
  • Ruben Gallego (D-Arizona)
  • Thomas Garrett (R-Virginia)
  • Greg Gianforte (R-Montana)
  • Jimmy Gomez (D-California)
  • Raúl Grijalva (D-Arizona)
  • Luis Gutiérrez (D-Illinois)
  • Steve Knight (R-California)
  • Leonard Lance (R-New Jersey)
  • John Lewis (D-Georgia)
  • Frank LoBiondo (R-New Jersey)
  • David Loebsack (D-Iowa)
  • David McKinley (R-West Virginia)
  • John Moolenaar (R-Michigan)
  • Tom Reed (R-New York)
  • Bobby Rush (D-Illinois)
  • Norma Torres (D-California)
  • Marc Veasey (D-Texas)
  • Brad Wenstrup (R-Ohio)
  • Steve Womack (R-Arkansas)
  • Lee Zeldin (R-New York)
Comments

Joshua

Totally agree. The method used for this level of accusation should be made available. What was the input image resolution and quality of the false matches? What were the confidence scores associated with each "match"? I suspect this information was omitted because the data doesn't strengthen the argument, and that should be considered. A quick perusal of Rekognition's FAQ section should raise a few questions about this article's credibility.

Sean Captain

Really good points, Ben. I'm a reporter, and if you'd like to provide more insight on this, I'd be happy to chat.

Anonymous

What was the percentage of people of color in the database of mugshots that you used?
That's a key piece of data you did not include, and it's no less important than the percentage of people of color among members of Congress.
It is well known that people arrested are disproportionately people of color (a big problem that we need to fight). Perhaps that is the source of the issue, and not the face recognition algorithms? Can't tell from the data you published. (I'm guessing it's both...)

Anonymous

Based on what I know about learning algorithms in general, the most likely cause is that the matching system was developed by feeding in data containing mostly white people. If they fed in a representative sample of the American population, it would be majority white, and if they grabbed it off social media it could be even more heavily weighted toward white faces.

The expected and predictable result of doing this is that the system will analyze images in a way that produces the best results on white faces. If the system is evaluated purely on total accuracy, this is desirable, but if used as a law-enforcement tool this will result in a higher rate of unjustified stops against minorities.

For reasons involving facial structure and reflectivity that are beyond my expertise, the best way of distinguishing faces in images varies by ethnicity.
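To illustrate that point with made-up numbers: a system can look accurate overall while its error rate is much higher for one group. A minimal Python sketch (hypothetical counts, not measured results):

```python
# Hypothetical evaluation counts, chosen only to illustrate the calculation.
# (group, matched_correctly) pairs for an imagined test set.
results = ([("A", True)] * 980 + [("A", False)] * 20
           + [("B", True)] * 85 + [("B", False)] * 15)

# Overall accuracy looks high...
overall = sum(ok for _, ok in results) / len(results)
print(f"overall accuracy: {overall:.1%}")  # 96.8%

# ...but per-group error rates tell a different story.
per_group = {}
for group, ok in results:
    correct, total = per_group.get(group, (0, 0))
    per_group[group] = (correct + ok, total + 1)

for group, (correct, total) in sorted(per_group.items()):
    print(f"group {group} error rate: {1 - correct / total:.1%}")  # 2.0% vs 15.0%
```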

Anonymous

Machine Learning is a complex space.
Full test parameters and results are needed to understand what went wrong and how to improve.

Anonymous

Doubt this will stop it

Justin E.

I am not sure if this was the name matched, but Gianforte actually does have a mugshot from assaulting a reporter.
The technology is still flawed and a horrible idea.

Anonymous

This study should be run on EVERY high-level govt official AND every CEO and Executive of these companies promoting this.

Anonymous

Please, ACLU, share your database (or a link) of the 25k criminal records and Congress photos, so that others can validate your findings. It would be unfortunate if you were the only group able to reproduce this issue.

Anonymous

It's the Amazon tool that the government wants to use on a citizens' database. They are saying that when the government uses this tool on its citizens, it will run off that kind of database. The ACLU didn't just pick a 25k criminal-records database at random; they are replicating what would happen when the government uses this tool.
