Your Favorite Website Might Be Discriminating Against You

It can be pretty convenient when Facebook processes the gargantuan amount of personal data it has on you to show you ads for the precise lemon curd recipe you never knew you were craving. 

But what are the harms associated with this kind of targeting? It’s hard to answer that question — because an overbroad law actually prohibits the kind of studies best positioned to figure it out.

The implications go far beyond dessert — studies have shown that people are being treated differently online based on their race, actual or perceived. Websites have been found to use demographic data to raise or lower prices, show different advertisements, or steer people to different content. 

The consequences are real. Big data has resulted in people of color, or people who live in communities of color, paying more for car insurance and being more likely to see ads for predatory loans.

One recent study by Harvard computer scientist Latanya Sweeney found that searches for names typically associated with Black people were more likely to bring up ads for criminal records. 

[Image: screenshot of a Boston Globe article titled "Racial bias alleged in Google's ad results"]

Another study found that Google showed ads for higher-paying executive jobs to users it presumed to be men. 

[Image: screenshot of a Washington Post article titled "Google's algorithm shows prestigious job ads to men, but not women. Here's why that should worry you."]

These examples are likely just the tip of the iceberg — but there’s so much we don’t know because the algorithms that advertisers use to target internet users are secret, as are the detailed profiles that big data brokers amass on ordinary people (and then sell to those advertisers).

To make matters more complicated, a law called the Computer Fraud and Abuse Act makes it illegal to do much of the work that’s required to uncover discriminatory practices online. The CFAA creates severe civil and criminal penalties for people who violate websites’ “terms of service” — all the fine print you never actually read that websites make you agree to. Internet activist Aaron Swartz was charged under that law and faced decades in prison before he took his own life in 2013.

[Image: Aaron Swartz]

Many terms of service governing the use of a website prohibit the use of automated technology and the creation of fake user profiles, which are the kinds of methods researchers rely on to audit algorithms and publish their results. If they can’t do that kind of research, we can’t know whether the algorithms that deliver content are discriminating against us. 
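To make the auditing methodology concrete, here is a minimal sketch of the kind of analysis such research involves. It assumes a hypothetical log of ad impressions collected from paired test profiles (the profile groups, ad categories, and data are all illustrative, not from any real study), and computes how much more often one group sees a given category of ad than the other:

```python
from collections import Counter

def disparity_ratio(impressions, category):
    """Share of `category` ads shown to each test group,
    returned as group_a's share divided by group_b's share."""
    counts = {g: Counter() for g in ("group_a", "group_b")}
    for group, ad_category in impressions:
        counts[group][ad_category] += 1
    shares = {}
    for group, seen in counts.items():
        total = sum(seen.values())
        shares[group] = seen[category] / total if total else 0.0
    if shares["group_b"] == 0:
        return float("inf")
    return shares["group_a"] / shares["group_b"]

# Hypothetical impression log: (test-profile group, ad category shown).
log = [
    ("group_a", "exec_job"), ("group_a", "retail"), ("group_a", "exec_job"),
    ("group_b", "retail"), ("group_b", "retail"), ("group_b", "exec_job"),
]
print(disparity_ratio(log, "exec_job"))  # → 2.0
```

The analysis itself is simple arithmetic; the legal risk comes from the data-collection step that precedes it, since gathering such a log typically requires the automated accounts that terms of service forbid.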

[Image: prison bars appearing over a researcher at a computer]

Is algorithmic discrimination the redlining of the 21st century?

For much of the 20th century, the federal government refused to guarantee mortgages in communities where people of color lived, and banks refused to lend in these neighborhoods. These neighborhoods were literally outlined in red on maps, hence the term redlining. 

[Image: map illustrating Manhattan neighborhoods that were redlined]

Families of color still feel the impact of being denied mortgages for decades. In general, they have inherited less wealth because the absence of credit meant homes in these neighborhoods were worth less, and these homes continue to be worth less, contributing to the racial wealth gap. As a result, they have also been denied access to high quality schools, adequate transportation, employment, and environmentally safe neighborhoods.

The Fair Housing Act of 1968 made redlining, along with other forms of housing discrimination, illegal. To make sure landlords and real estate agents are following the law, fair housing testers of different races apply for housing, seeking to make sure that testers of color are treated the same as white testers.   

[Image: Black family in front of house with "for sale" sign fading to white family with "sold" sign]

Just as offline fair housing testing is crucial for discovering housing discrimination, robust online testing will be necessary to ensure that these protections are extended to the internet, too. Government agencies should require websites — especially housing providers, employers, and lenders — to audit themselves to ensure that they are not discriminating and should encourage researchers to regularly test and monitor algorithms to be sure they’re treating everyone fairly. It is outrageous that the CFAA prohibits so much of this work. It’s also unnecessary — it’s entirely possible to permit civil rights testing without opening the door to fraud.

The ACLU is challenging the CFAA to ensure that researchers and journalists aren’t thwarted from pursuing valuable research and investigations to determine whether inequality is baked into the algorithms that increasingly govern our world. 

[Image: a white hand on a computer mouse side by side with a Black hand on a computer mouse]

The next generation of civil rights testing will need to happen online. Without this kind of research, we’ll have no way of knowing whether websites are selling or advertising goods and services in a way that exacerbates existing racial disparities. 

For more on the Computer Fraud and Abuse Act, read our Free Future post, "ACLU Challenges Computer Crimes Law That is Thwarting Research on Discrimination Online."

Comments (17)

Anonymous

I think you are confusing discrimination with marketing.

Anonymous

I think you are confused if you think that the second excludes the first. Marketing can be discriminatory (and I say this after nearly a decade producing ads).

HawkAtreides

That's horribly simplistic, and missing the entire point of the article. If "marketing" means assuming that a minority name is more likely to be associated with a criminal record, or that women are not interested in - or eligible for - executive positions, then it's indistinguishable from discrimination.

Anonymous

The problem is the increasing use of interconnected big data to make decisions; marketing is just one symptom of these underlying algorithms that we can test. If marketing algorithms produce this behavior, what happens when a local police department wants to implement big data and algorithmic practices in its work? Also, it's certainly harmful if targeted marketing means certain people don't receive ads about useful opportunities, like exec positions.

Anonymous

But marketing with bias can be discriminatory. If you are never made aware of a service, how can you buy it? That is how historical racism worked: you weren't allowed in "that store," you weren't shown "that house," and therefore you never bought that service. Cigarette and liquor companies both got called out for targeting certain races in the '80s. Racism is rarely a guy in a white sheet; it is sly and underhanded nowadays.

Anonymous

Read the article! You can't be that dense!

Anonymous

So you think there's a distinction between the two in a capitalist society?

Manny

If you search black names, yeah, you probably will find arrest records, because our country loves to arrest black people. I thought this was common knowledge? Google is showing the truth.

Anonymous

Read the article. It is referring to _sponsored_ search results: in other words, paid advertisements displayed in response to keywords selected by the advertiser. This is how Google makes money.

Anonymous

Maybe it's because there is more chance of your car getting stolen or damaged in black neighborhoods than white ones, don't you think?

Also, Google doesn't know you're male, female, black, white, or Asian. Its results are based on what you've searched before and what kinds of websites you visit. Obviously, if men visit more websites related to law, business, and tech, they'll get more prestigious job ads. You can't expect to get a Senior Engineering ad when all you've been searching for the past couple of years has been "cute summer dresses," right?
