NEW YORK — Over 2,200 entities, including ICE, CBP, the FBI, and a number of local law enforcement agencies, have teamed up with Clearview AI to run faces against the company’s database of billions of photos, according to a company client list reviewed by BuzzFeed News. The list was reportedly obtained via a security flaw in Clearview’s platform.
Below is a comment from Nathan Freed Wessler, staff attorney with the ACLU’s Speech, Privacy, and Technology Project, in response to the reported list of entities:
“This list, if confirmed, is a privacy, security, and civil liberties nightmare. Government agents should not be running our faces against a shadily assembled database of billions of our photos in secret and with no safeguards against abuse. It is also alarming that, despite the growing opposition to this technology, these agencies insist on using an error-prone and privacy-invading technology peddled by a company that can't even keep basic client information secure.
“States should follow New Jersey's lead and prohibit police from using Clearview's dangerous system. And lawmakers nationwide should immediately halt law enforcement use of face recognition technology, as communities nationwide have.
“This technology will end privacy as we know it, and must be stopped.”
In China, the government is already using face recognition surveillance to track and control ethnic minorities, including Uighurs. Protesters in Hong Kong have had to resort to wearing masks to protect themselves from government persecution.
There is also widespread evidence that face recognition technology is error-prone and biased, with error rates higher for faces of women and people with darker skin.
Recognizing these harms, states and cities across the nation have reined in law enforcement use of face recognition technology as part of ACLU-led efforts. Three California cities — San Francisco, Berkeley, and Oakland — as well as four Massachusetts municipalities — Springfield, Somerville, Northampton, and Brookline — have banned government use of face recognition technology in their communities. The state of California blocked police body cam use of the technology. In New York City, tenants successfully fended off their landlord’s efforts to install face surveillance.
Similar efforts are currently under consideration in cities and states across the country, including in Detroit and in Massachusetts, Washington, and California. Members of Congress have also expressed significant concerns with the technology and have held a series of hearings to investigate its use.
Many of these city and state initiatives are part of the ACLU’s Community Control Over Police Surveillance (CCOPS) effort, which is designed to ensure residents — through their local governments and elected officials — are empowered to decide if and how surveillance technologies are used, and to promote government transparency with respect to surveillance technologies.