
The Brave New World of Discrimination

White Picket Fence
Dennis Parker,
Former Director,
ACLU Racial Justice Program
September 9, 2015

In recent years, the Internet has played an important role in the continuing battle against discrimination. Because of the ability to share information instantly and broadly, discriminatory abuses and practices have received an unprecedented level of public exposure on social media. Out of this exposure have sprung movements, such as #blacklivesmatter and other grassroots organizations, that have brought a fierce new urgency to the struggle against discrimination and renewed the prospect of meaningful change in a time of often overwhelmingly negative news.

But alongside the potential for bringing about social progress, the Internet also holds the possibility of contributing to unlawful discrimination. An example of this potential negative impact is a patent recently acquired by Facebook that could conceivably permit loan servicers to gain access to the credit ratings of a loan applicant’s social network and then use that information to determine whether the applicant qualifies for a loan. The patent combines the possibility of serious invasions of privacy with the realistic prospect of illegal lending discrimination.

It isn’t hard to conceive of how this could harm, if not ruin, lives. For example, a responsible individual with a good credit rating may find herself penalized by the credit ratings of the individuals in her social network, or by an approach that factors in her neighbors’ credit scores. Given the long, proven correlation between individual and neighborhood-level credit scores and race, such an approach could functionally exclude creditworthy people of color from receiving fair credit simply because they know or live in the vicinity of other people of color who, because of the history of discrimination in lending, may themselves have lower credit scores. The possibility of this happening is not far-fetched, as we have explained to the Federal Trade Commission, and would only further entrench racial segregation and other forms of financial discrimination.
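To see how such a penalty could work in practice, consider a purely hypothetical sketch (the function name, weighting scheme, and threshold below are illustrative assumptions, not anything described in Facebook's actual patent): a lender that blends an applicant's own score with the average score of her social network can end up denying an applicant whose individual score would easily qualify on its own.

```python
# Hypothetical illustration only: a scoring rule that mixes an applicant's
# own credit score with the average score of her social network.

def network_adjusted_score(own_score, network_scores, weight=0.5):
    """Blend the applicant's own score with her network's average score.

    `weight` is the (assumed) share given to the network average.
    """
    network_avg = sum(network_scores) / len(network_scores)
    return (1 - weight) * own_score + weight * network_avg

# Assumed approval cutoff for this illustration.
APPROVAL_THRESHOLD = 660

applicant_score = 720  # a good individual score, well above the cutoff

# Network scores depressed by a history of lending discrimination.
network_scores = [580, 600, 610, 590]

blended = network_adjusted_score(applicant_score, network_scores)
print(blended)                       # 657.5
print(blended >= APPROVAL_THRESHOLD)  # False: denied despite a 720 score
```

The applicant's own 720 clears the cutoff, but averaging in her network drags the blended score to 657.5 and she is denied — without any change in her own creditworthiness.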

The fact that this outcome would be the result of the operation of a computer algorithm rather than the result of intentional bias is both legally and factually irrelevant. Both the Equal Credit Opportunity Act and the Fair Housing Act, the two laws most likely to be implicated by unfair denials of credit, recognize that policies that have an unfair impact may be illegal regardless of intent. And that is as it should be. It would be little consolation to a person denied a home mortgage or a business loan because of her online friends that the denial was the result of a computer program rather than malice. That person would still be denied the better educational and employment opportunities that she sought by purchasing a new home, or the possibility of lifting her family into a more financially stable position by opening a new business.

Such a denial would be contrary to the core intent of laws designed to assure fairness and equality, which urge treating people as individuals rather than judging them as members of a group. No matter how hardworking and financially careful individuals are, the actions of their online friends or their neighbors could scuttle all of their dreams. And most disturbing of all, it is likely that they would never know the reason why.

Although the patent has not yet been and may never be used, it harbors the potential to further damage the financial well-being of people still feeling the harsh effects of earlier financial discrimination. It would be wise and fair for anyone considering implementing this patent to keep this harmful impact in mind and for financial regulators to monitor the use of the program carefully.

And beyond its own applications, the patent highlights the larger potential for algorithms and big data to perpetuate discrimination. Recent studies, such as the one finding that Google ads for high-paying jobs were more likely to be shown to men than to women, show that even if algorithms are not explicitly designed to discriminate, they can have negative effects on groups who have historically been discriminated against. Particularly when it comes to access to employment, housing, education, and credit — areas that the civil rights laws have recognized as particularly important — we must be vigilant to ensure that automated decision-making does not replicate existing societal discrimination.
