Statement of Concern About Predictive Policing by ACLU and 16 Civil Rights, Privacy, Racial Justice, and Technology Organizations

Document Date: August 31, 2016

On August 31, 2016, a coalition of 17 organizations issued the following statement about predictive policing tools used by law enforcement in the United States, pointing to the technology’s racial biases, lack of transparency, and other deep flaws that lead to injustice, particularly for people of color.

Predictive Policing Today: A Shared Statement of Civil Rights Concerns

August 31, 2016

A growing number of police departments across the United States are deploying new computer systems that use data in an attempt to automatically forecast where crime will happen or who will be involved. Today, these “predictive policing” tools are used primarily to further concentrate enforcement activities in communities that are already over-policed, rather than to meet human needs.

The institution of American policing, into which these systems are being introduced, is profoundly flawed: it is systemically biased against communities of color and allows unconscionable abuses of police power. Predictive policing tools threaten to provide a misleading and undeserved imprimatur of impartiality for an institution that desperately needs fundamental change. Systems that are engineered to support the status quo have no place in American policing. The data driving predictive enforcement activities — such as the location and timing of previously reported crimes, or patterns of community- and officer-initiated 911 calls — is profoundly limited and biased.

Decades of criminology research have shown that crime reports and other statistics gathered by the police primarily document law enforcement's response to the reports they receive and the situations they encounter, rather than providing a consistent or complete record of all the crimes that occur. Vendors who sell these new tools and departments that embrace them are failing to account for these realities, or to evaluate whether the data is so flawed that it cannot be relied upon at all. As a result, current systems reinforce bias and sanitize injustice.

Automated predictions based on such biased data — although they may seem objective or neutral — will further intensify unwarranted discrepancies in enforcement. Because of the complexity and secrecy of these tools, police and communities currently have limited capacity to assess the risks of biased data or faulty prediction systems.

Even within a broken criminal justice system, there are places where data can be a force for good: For example, data can identify people with mental illness for treatment rather than punishment, or provide early warning of harmful patterns of officer behavior. However, today, most “predictive policing” is not used for such constructive interventions. Instead, it concentrates existing law enforcement tactics, and will intensify stringent enforcement in communities of color that already face disproportionate law enforcement scrutiny.

We believe that:

1. A lack of transparency about predictive policing systems prevents a meaningful, well-informed public debate. Whenever automated predictions are considered for policing, all stakeholders must understand what data is being used, what the system aims to predict, the design of the algorithm that creates the predictions, how predictions will be used in practice, and what relevant factors are not being measured or analyzed. The natural tendency to rush to adopt new technologies should be resisted until a true understanding is reached as to their short- and long-term effects. Vendors must provide transparency, and the police and other users of these systems must fully and publicly inform public officials, civil society, community stakeholders, and the broader public on each of these points. Vendors must be subject to in-depth, independent, and ongoing scrutiny of their techniques, goals, and performance.

Today, instead, many departments are rolling out these tools with little if any public input and, often, little if any disclosure. Vendors are shrouding their products in secrecy, even seeking gag clauses or asking departments to pledge to spend officer time resisting relevant public records requests as a precondition for trying out their products. These practices must stop. Claims of trade secrecy or business confidentiality must not be allowed to override the public's interest in transparency.

Transparency is necessary but not by itself sufficient: a thorough and well-informed public debate, and rigorous, independent, expert assessment of the statistical validity and operational impact of any new system, are essential before any new system can be deployed at scale. Continuous assessment is vital for as long as the system is in use.

2. Predictive policing systems ignore community needs. Most predictive policing systems fielded today focus narrowly on the reported crime rate. Other vital goals of policing, such as building community trust, eliminating the use of excessive force, and reducing other coercive tactics, are currently not measured and not accounted for by these systems. As a result, current systems are blind to their impact in these areas, and may do unnoticed harm. Policing should be equitable across racial and geographic lines. This requires measuring and tracking all uses of coercive authority and the demographics of the people involved.

3. Predictive policing systems threaten to undermine the constitutional rights of individuals. The Fourth Amendment forbids police from stopping someone without reasonable suspicion — a specific, individualized determination that is more than just a hunch. Computer-driven hunches are no exception to this rule, and a computer’s judgment is never a further reason (beyond the articulable facts that intelligibly caused that judgment) for a stop, search, or arrest. Similarly, predictive policing must not be allowed to erode rights of due process and equal protection. Systems that manufacture unexplained “threat” assessments have no valid place in constitutional policing.

4. Predictive technologies are primarily being used to intensify enforcement, rather than to meet human needs. Social services interventions can help to address problems for at-risk individuals and communities before crimes occur. Communities that invest in predictive technologies should consider whether and how these systems could be used to more effectively allocate social service resources, including educational opportunities, job training, and health services, taking into account the privacy interests of communities and the limits of available data. As the President’s Task Force on 21st Century Policing noted, “the justice system alone cannot solve many of the underlying conditions that give rise to crime. It will be through partnerships across sectors and at every level of government that we will find the effective and legitimate long-term solutions to ensuring public safety.”

5. Police could use predictive tools to anticipate which officers might engage in misconduct, but most departments have not done so. Early experiences from Chicago and elsewhere show that police misconduct follows consistent patterns, and that offering further training and support to officers who are at risk can help to avert problems. Police should be at least as eager to pilot new, data-driven approaches in the search for misconduct as they are in the search for crime, particularly given that interventions designed to reduce the chances of misconduct do not themselves pose risk to life and limb.

6. Predictive policing systems are failing to monitor their racial impact. Systems that are currently deployed, or are contemplated for future deployment, must each be publicly audited and monitored on an ongoing basis for their disparate impact on the different communities each police department serves, with results broken out by race and by neighborhood. And those disparities must be addressed.

Signatories:

The Leadership Conference on Civil and Human Rights
18 Million Rising
American Civil Liberties Union
Brennan Center for Justice
Center for Democracy & Technology
Center for Media Justice
Color of Change
Data & Society Research Institute
Demand Progress
Electronic Frontier Foundation
Free Press
Media Mobilizing Project
NAACP
National Hispanic Media Coalition
Open MIC (Open Media and Information Companies Initiative)
Open Technology Institute at New America
Public Knowledge
