Facebook’s Targeting System Can Divide Us on More Than Just Advertising

It’s heartening to see, in the wake of the Cambridge Analytica revelations, growing skepticism about how Facebook handles data and data privacy. But we should take this opportunity to ask the bigger, harder questions, too — questions about discrimination and division, and whether we want to live in a society where our consumer data profile determines our reality.

In the spring of 2016, a Facebook executive gave a presentation about the success of Facebook’s then-new “ethnic affinity” advertising categories. Facebook had grouped users as white, Black, or Latino based on what they had clicked, and this targeting had allowed the movie “Straight Outta Compton” to be marketed as two completely different films. For Black audiences, it was a deeply political biopic about the members of N.W.A. and their music, framed by contemporary reflections from Dr. Dre and Ice Cube. For white audiences, it was a scripted drama about gangsters, guns, and cops that barely mentioned the names of its real-life characters. From the perspective of Universal Pictures, this dual marketing had been wildly successful. “Straight Outta Compton” earned over $160 million at the U.S. box office.

When we saw this news in 2016, it immediately raised alarm bells about the effect of such categories on civil rights. We went straight to Facebook with our immediate concern: How was the company ensuring that ads for jobs and housing weren't targeted by race, given that such targeting is illegal under the civil rights laws? Facebook didn't have an answer. We worked with officials from the company for more than a year on solutions that, as it turned out, were not properly implemented. Facebook still makes it possible for advertisers to target based on categories closely linked to gender, family status, and disability, and the company has recently been sued for it.


To make matters worse, the government is actively turning a blind eye. The New York Times reported on Thursday that, under Secretary Ben Carson, the federal Department of Housing and Urban Development dropped its investigation into whether Facebook’s ad targeting system violated the Fair Housing Act. That means that HUD, on the eve of the 50th anniversary of that law, is choosing to put its head in the sand rather than investigate whether civil rights laws have been broken.

It’s not illegal to market “Straight Outta Compton” differently based on race (as opposed to, say, a housing or employment ad). Nonetheless, that tactic creates a distinction among people and treats them differently as a result. And these kinds of distinctions have real-world effects: Think about what it means to white teenagers to see a trailer with yet another image of criminal Black men, instead of hearing Dr. Dre reflect on police brutality in the 1980s and today.

Then magnify that effect hundreds and thousands of times. In today’s world, a huge proportion of the advertising and media that we see reaches us based on accumulated data about us. If ad targeting means that my family and yours hear and read about different movies and TV shows, will that make it impossible for America to have another cross-racial Roots moment? (In 1977, 130 million Americans watched at least part of the famous miniseries tracing a Black family’s journey from Africa to slavery to the present day.)

Targeting, of course, does enable advertisers — including the ACLU — to efficiently reach particular audiences with messages that are tailored to them, and that can sometimes be a good thing. But that doesn’t mean we shouldn’t acknowledge what’s lost with that efficiency: that people outside of the expected audiences won’t see these messages or know they exist.

Ad targeting can make the world look different to different people. Some find the web full of job ads for high-paying CEO jobs, while others see mostly ads for sneakers or payday loans. Our news also reaches us and our networks through ad targeting. How can this not have huge implications for our ability to exist in a cohesive society? How can we agree on the policies that should govern our world when there are no common reference points for what that world looks like?

It’s not just foreign interference and voter suppression campaigns that make this kind of targeting so dangerous for democracy.

Comments


Probably the biggest danger is if the U.S. Supreme Court allows "searches" of online data without a probable cause warrant from a court. It's a huge danger from both government agencies and private entities because it allows anyone to be penalized WITHOUT your knowledge, robbing you of legal standing.

When you are "unaware" that you were penalized for perfectly legal First Amendment exercises, it robs you of "legal standing" in court. You can never challenge or correct this system.

For example: maybe you made a polite and thoughtful comment about "police body cameras". As taxpayers and employers of your local police officers, you have every right to; it's neither wrong nor illegal. A local police officer trolling social media may strongly disagree with you. Some officers may track your cell phone and subject you to extra enforcement (a speeding ticket, etc.). Although that is a federal crime under Title 18, U.S. Code, Section 242, and a violation of the 14th Amendment, which the officer swore to uphold, DOJ prosecutors generally don't like to police the police. You may never know why you received the speeding ticket and the guy driving faster than you didn't.

Freedom of Speech was designed by the Founding Fathers to affect the democratic process. When citizens legally exercise their First Amendment rights they can also unknowingly be competing with "corporate lobbyists" that pay lots of money to get the legislation they want passed. Sometimes you can unknowingly make corporate enemies by doing nothing wrong or illegal. Example: signing a petition for more jury trials and funding Public Defenders could affect the profits of private prisons, etc.

The U.S. Supreme Court should require "Probable Cause warrants" to allow online searches that could harm Americans. There is no effective watchdog to protect online users.

Dr. Timothy Leary

Facebook is for suckers. Are you a sucker?

Mike Chase

This piece raises such important questions. The business model of Facebook is built essentially on “profiling” people, in the service of targeted advertising.


What about newsfeeds like Yahoo's that use similar programs to decide which articles to send to people's feeds? Those are dangerous, too, as they can result in people receiving only articles slanted either right or left, or not receiving articles about important issues at all. That is dangerous for democracy: if you never read opposing views, you may not realize there are other aspects of an issue, and if you never hear about an issue at all, you don't get the opportunity to take any action on it. That probably influences people more than advertising, as a good majority of adults realize advertising is slanted or ignore it completely. Most adults also know how to find out about things like apartments for rent even if they never see an ad about it on Facebook.


You make a great point: with "no common reference points," how can we achieve a cohesive society?
I grew up in a mixed racial and ethnic home in upper-middle-class suburbia of Boston. I've come to realize that there is no "cohesive experience" for us to learn from. Many times I've witnessed racial experiences firsthand with my white relatives and family members. They can be sympathetic or even have empathy, but what we both saw and how we interpret it are completely different.
Growing up, I had this experience numerous times. People see what they want to see. You can show the same ad to everybody, but they are all going to interpret it differently. It's frustrating because, as a middle-aged African-American male, I don't understand why the people I've shared the most in common with have so many differing points of view and political opinions.
