
Facebook, Objectivity, and 'Intelligence Anxiety'

Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
May 17, 2016

I’ve been following with great interest the flap over Facebook’s alleged censoring of conservative-oriented news items in its influential “trending news” section. It started with an allegation in Gizmodo, attributed to a former employee, that the social network overrode its own data, using human discretion to place some stories in that section and remove others, and that in doing so it was biased against conservative material.

Obviously, as many have pointed out, this is a reminder of the enormous power that Facebook wields. One can make two complaints about Facebook. First, that it shouldn’t be biased. Personally, I’m ambivalent on that question. Insofar as the company is technically a publisher, it has the same right as any newspaper to pick and choose what it will publish, and to be liberal, conservative, or anything else. As a legal matter, Facebook almost certainly has such a right. Of course, insofar as it acts not as a news source but as a forum in which people communicate with each other, bias is more worrisome, if it means distorting the way that people connect and communicate. More broadly, that role as forum-host or platform is behind what I think is a larger public expectation that the company will be generally neutral and even-handed. But when push comes to shove, here it was operating a news feed not very different from what a newspaper does.

The second, more interesting complaint one could make about Facebook is that it has implicitly misled its readers into believing that they are seeing an “objective” measurement of mass interest in various stories when they are not: letting people believe that Facebook is not expressing its own judgments about what stories to present, but holding up a mirror to its user base to show everybody what they are collectively interested in. When I see a list on the side of a newspaper site that says “most read” or “most shared,” I assume that a relatively dumb algorithm is simply counting up clicks. I don’t know that Facebook ever explicitly claimed that the news section in question, labeled simply “Trending,” was the equivalent of that (“trending” is a fairly loose word), but it’s a natural assumption to make in this age of data and algorithms. If people believe they are seeing a picture of what the world looks like via dumb data, but are actually seeing a curated source, that’s a problem.
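To make that concrete, here is a minimal sketch, in Python and with invented story names and share counts, of the kind of “dumb” trending algorithm a reader might assume is at work. It does nothing but count share events and surface the most-shared stories; there is no editorial judgment anywhere in it.

```python
from collections import Counter

def trending(events, top_n=10):
    """A 'dumb' trending list: count share events per story and
    return the most-shared stories. No weighting, no curation --
    the output is purely a mirror of what users collectively did."""
    counts = Counter(events)
    return [story for story, _ in counts.most_common(top_n)]

# Hypothetical share events observed in the last hour:
recent_shares = ["story-a", "story-b", "story-a",
                 "story-c", "story-a", "story-b"]
print(trending(recent_shares, top_n=2))  # ['story-a', 'story-b']
```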

Last year, in a blog post on “The Virtues of Dumbness,” I wrote about the many situations in which we actually want and expect decision-making processes to be dumb rather than smart. I argued that as artificial intelligence creeps into more and more of the things around us, we will increasingly experience a condition I call “intelligence anxiety”: the worry that something that is supposed to be dumb and neutral actually has intelligence embedded within it, and is therefore capable of bias and manipulation. (If somebody has a better term, please let me know!) As I wrote, “the really bad things come when we think something is dumb but it’s not.” That appears to be exactly what happened with Facebook’s “Trending” section. This would not be the political controversy it is if people didn’t think the story selection in that section was automatic and expect a certain kind of dumbness.

Of course, in that essay I was mainly talking about how computer intelligence is seeping into things. The intelligence here came from humans, and the dumbness (click counting) from computers, but it can go either way. Humans organized into bureaucracies can be dumb, and computers, as we see more every day, can be smart and subtle and manipulative. There is no such thing as an unbiased algorithm—but there are very simple algorithms (both computer and bureaucratic), and as long as they are dumb we feel we can predict and count upon the ways that they will affect the world.
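A contrived continuation of the earlier sketch (again Python, with made-up events and a made-up window_minutes parameter) illustrates the point: even a purely mechanical counter embeds human choices, such as how far back to count, that change which stories surface. What keeps it “dumb” is that the choice is fixed, simple, and predictable, so anyone who knows the rule can anticipate its effect.

```python
from collections import Counter

# Hypothetical (story_id, minutes_ago) share events:
events = [("story-a", 5), ("story-a", 50), ("story-a", 55),
          ("story-b", 2), ("story-b", 8), ("story-c", 40)]

def trending(events, window_minutes, top_n=2):
    """Count shares within the last `window_minutes`. Still 'dumb',
    but the window is a human choice that shapes the result."""
    counts = Counter(s for s, age in events if age <= window_minutes)
    return [s for s, _ in counts.most_common(top_n)]

print(trending(events, window_minutes=10))  # ['story-b', 'story-a']
print(trending(events, window_minutes=60))  # ['story-a', 'story-b']
```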

The takeaway here is that as intelligence anxiety spreads, the pressure on companies like Facebook, as well as on the government and everybody else, to be transparent will become greater than ever. Facebook was smart to react to this flap by trying to explain fully to everybody how its “Trending” section works. Even if organizations make no false claims about the dumbness of certain algorithms, they will increasingly need to proactively disclose what exactly is and is not going on under the hood.
