Facebook Can’t Clean Up Ad Discrimination on Its Own

This piece was originally published at NBC's THINK.

Facebook has admitted that a serious flaw in its advertising system is allowing racial discrimination on its platform. But there is a way to fix it — if the company is willing.

In the spring of 2016, Facebook rolled out its “ethnic affinity” feature, which allowed advertisers to target Facebook users labeled as African American, Latino, or Asian American based upon their behavior on Facebook. Advertisers could opt to include or to exclude users in these categories. Facebook said that these labels were not equivalent to race because they were based not on users’ actual racial identities, but on whether they engaged with Facebook pages associated with those racial communities. Nonetheless, it identified the categories as “demographics” in its options for advertisers.

The system made it easy to exclude users marked as African American from seeing ads for anything, including job postings and credit or housing opportunities. Yet civil rights laws like the Fair Housing Act make this kind of discriminatory advertising illegal.

In October 2016, ProPublica was able to place a housing-related ad that targeted house hunters and those likely to move, excluding users marked as African American, Asian American, or Hispanic. The story prompted an immediate outcry. The Congressional Black Caucus contacted Facebook, and the Department of Housing and Urban Development, which enforces fair housing laws, said the revelations raised “serious concerns.”

Make no mistake, this is not simply an advertising problem — this is a civil rights problem made all the more dangerous by social media’s technological advances. Online personalization opens up significant possibilities for discrimination against marginalized communities, including people of color and other members of protected classes. In the offline world, we have thankfully moved past the era of housing advertisements that explicitly stated that people of certain races, religions, or ethnicities could not apply. But with behavioral targeting online, discrimination no longer requires that kind of explicit statement. Instead, a property manager can simply display ads for housing only to white people, or Christians, or those without disabilities.

In response to ProPublica’s 2016 investigation, Facebook expressed a desire to solve this discrimination problem built into its ad targeting business. We at the ACLU and other advocates spent many hours helping the company move toward some fixes. We helped Facebook settle on a system that we were told would use machine learning to detect ads for housing, credit, or employment and treat them differently. In those categories, Facebook promised, ethnic targeting would be impossible and advertisers would have to certify that they were not violating the law or Facebook’s anti-discrimination policy before their ads would run.

At the time, we praised Facebook’s changes — although they left some significant questions unanswered — hoping acknowledgment of the civil rights laws would become standard throughout the online advertising ecosystem.

Fast forward to last week. ProPublica tested the system again and found that Facebook was still allowing advertisers to prevent users from seeing an ad for rental housing based on race, now rechristened “multicultural affinity.” ProPublica was also able to exclude people in wheelchairs and Spanish speakers, among others.

We have been extremely disappointed to see these significant failures in Facebook’s system for identifying and preventing illegal advertising discrimination. Facebook’s representations to us over the course of the last year indicated that this problem had been substantially solved, but it now seems clear that was not the case. Discrimination in the rental housing market is one of the most toxic and tenacious forms of contemporary discrimination. And, as the recent Pulitzer Prize-winning book “Evicted” demonstrates, discrimination in rental housing is a key driver of poverty and inequality in this country.

In a statement, Facebook blamed “technical failure” for ProPublica’s recent findings and continues to express a desire to get this right. The company now says that all advertisers who want to exclude groups of users from seeing their ads — and not just those advertising housing, credit and employment — will have to certify that they are complying with anti-discrimination laws. And, just yesterday, the company announced that it would temporarily turn off all advertisers’ ability to exclude users by race while it continues to work on these problems.

Still, this story makes the need for greater transparency and accountability from these online platforms that much more urgent. Discrimination in the virtual world is no less damaging than offline discrimination, but it can be even more difficult to root out. People who are excluded from viewing Facebook advertisements, for example, would never know that the housing opportunity existed, and so they would not apply. 

What’s worse, unlike in the pre-digital world, where organized communities and advocates could spot discriminatory ads and report them, it is impossible for someone who didn’t see a relevant housing ad to prove that discriminatory targeting is responsible. And of course, it is the most vulnerable Americans who tend to lose the most from predatory business practices.

The good news is that there is a way to find these issues: audit testing by outside academic researchers. Had Facebook allowed independent researchers to examine the system it had created to catch discriminatory ads, those researchers could have spotted the problems and shut down the mechanism for discrimination sooner.

There are whole communities of researchers ready, willing and able to conduct the audits that could help protect the public from some of the digital platforms’ most pernicious effects on civil rights. Having learned that it is not well-equipped to police its own systems, Facebook should commit to allowing independent audits. Justice, fairness and civil rights laws demand no less.
