Big Data Could Set Insurance Premiums. Minorities Could Pay the Price.

Are you a registered Democrat? You could be more likely to experience anxiety these days, causing you to need more mental health care. Have you lived in neighborhoods near industrial zones? That could increase your chance of chronic illness. Do you buy video or board games? You might be less likely to exercise, raising your medical costs in the long term.

According to an investigation that ProPublica and NPR released on Tuesday, health insurers have begun acquiring huge amounts of non-health-related data about the people they insure or will potentially insure. This data includes race, net worth, consumer behavior, criminal and civil court records, and prior addresses, among other things. Health insurers buy it from data brokers, who scoop up pretty much everything from the data trails we all leave behind as we move through the world. Those data brokers, as well as the health insurers themselves, also create algorithms to find relevant patterns in this data — like relationships between particular purchasing habits or life events and increased health care expenditures. 

While health insurers claim they’re not using these algorithms to set insurance costs for individuals, they’re unable to cite any law that would prevent them from doing just that. And considering that the very purpose of insurance is to assess risk and charge customers accordingly, there’s a very real concern that insurers will start using these algorithms to set their fees.

Existing health disparities mean that data will consistently show members of certain groups to be more likely to need more health care. What will happen, then, if this data starts being used against those groups? We know, for example, that Black women are much more likely to experience serious complications from pregnancy than white women. So, health insurers might conclude that a woman who is Black and recently married is likely to cost them more money than a white woman in the same position. Even in cases where they don’t have accurate race data, insurers might draw the same conclusion for women who purchase Black hair-care products or those who have tweeted about television shows like Atlanta or Scandal.

More broadly, people who live in poor neighborhoods and neighborhoods of color are much more likely to have health problems than those in affluent neighborhoods. The ProPublica piece quotes one health data vendor joking, “God forbid you live on the wrong street these days … You’re going to get lumped in with a lot of bad things.” Is it fair to make health care more expensive for people based on zip code or race?

The Affordable Care Act prohibits insurers from discriminating on the basis of pre-existing conditions or gender, but it doesn’t say anything about race, religion, national origin, or anything else insurers can learn about you from data brokers. At the state level, where insurance in this country is largely regulated, more than half of states don’t even explicitly prohibit the use of race in pricing health insurance. That’s a problem, especially in the age of big data, when it’s extremely tempting for insurers to raise prices for customers they perceive to be risky, sometimes in order to drive them away. Insurers in other lines, like auto or homeowners’ insurance, have already started to use digital data to raise prices for customers they predict won’t switch insurers if their rates go up. It’s a big enough problem that 20 states have issued bulletins banning the practice.

Historical and ongoing racial discrimination has created an enormous racial wealth gap, and because we continue to live in such a segregated country, almost all the data held by data brokers reflects and encodes racial disparities. When predictive models are built using this data, people of color are consistently disadvantaged: Black people whose credit scores are as good as or better than those of white people might not get a loan simply because of the neighborhood in which they live.

If that happens in the lending context, the federal Equal Credit Opportunity Act protects the borrower. When similar algorithmic discrimination occurs in the housing market, the Fair Housing Act provides protection, as does Title VII when there’s a job at issue. Since, in addition to barring intentional discrimination, each of these statutes prohibits neutral policies that nonetheless have a disparate impact on members of protected groups — like people of color — they are vital in the era of algorithmic decision-making. (Although the Trump Administration is doing its best to get rid of this crucial “disparate impact” standard.)

The ProPublica report shows that the danger of discrimination in insurance is increasingly real. But there’s a big hole in civil rights law when it comes to insurance. State legislatures should explore new ways to prevent discrimination in health insurance, including requirements that insurers audit their own use of consumer data for discriminatory effects and publish the results. Consumers deserve no less.

Comments

Dr. Timothy Leary

They are always coming up with some new way to screw you. The only way to escape it is to go live in the wilderness like a hippie, a raccoon or something.

Anonymous M.C.

Just be careful not to get any tick-borne diseases while you're out there.

Anonymous

b s

Lin Peterson

I once worked for a data collection place. My job wasn’t collecting data but I was appalled at the data I saw collected. It was supposed to be without identifying names, how many people in a community had fireplaces, pools, etc. However, it is up to the integrity of the data collector whether to link names to data. This article is NOT BS. It is all too real.

York Hunt

I don’t see an issue with this, especially with car insurance. The black community always claims they get pulled over because the system is racist, but the reality is they’re by and large terrible drivers. As a result of reckless speeding, weaving in and out of lanes without signaling, and a much higher rate of hit and runs than other races, they catch the eyes of law enforcement. It’s not always about race but about actions and choices.

Anonymous

Wow. Racist much?

Anonymous

obviously you have never driven a car while black.

Anonymous

What an incredibly racist comment! It is interesting to see how incredibly unlearned and myopic large swaths of the US population remain. I leave you with one thought: "When they came for the gypsies, I said nothing. When they came for the blacks, I said nothing. When they came for the Jews, I said nothing. When they came for me, there was no one left."

Anonymous

Oh wow. Is this person for real?

Anonymous

You are too biased; you should not generalize. I've seen that in all races.
