Naked Statue Reveals One Thing: Facebook Censorship Needs Better Appeals Process

We at the ACLU were reassured of one thing this past weekend: Facebook’s chest-recognition detectors are fully operational. A recent post of ours, highlighting my blog post about an attempt to censor controversial public art in Kansas, was itself deemed morally unfit for Facebook. The whole episode is a reminder that corporate censorship is bad policy and bad business.

The blog is about a kerfuffle over a statue in a public park outside Kansas City: a nude woman taking a selfie of her own exposed bronze breasts. A group of citizens organized by the American Family Association believes the statue to be criminally obscene (it isn’t), and has begun a petition process to haul the sculpture to court (really, it has). Our Facebook post included a link to the blog post and a photo of the statue in question.

Our intrepid Digital Media Associate, Rekha Arulanantham, got word on Sunday that the Facebook post had been deleted, and was no longer viewable by our Facebook followers or anyone else. I duly informed my Kansas colleague Holly Weatherford that the photograph she’d taken had prompted a social media blackout. Then, astoundingly, on Tuesday morning Rekha discovered the ACLU had been blocked from posting for 24 hours, with a message from Facebook warning us these were the consequences for repeat violations of its policy.

We were flabbergasted; we hadn’t tried to republish the offending post or the associated rack. So, just to get this straight: the ACLU’s post on censorship was shut down—not once, but twice—for including a picture of, and a political discussion about, a statue standing in a Kansas park.

Why Was Our Post about Censorship Censored?
Facebook’s notice told us that the post was removed because it “violates [Facebook’s] Community Standards.” While my blog did include a comprehensive slate of synonyms for “boobs,” it was the visual subject of the blog—the image of the statue itself—that triggered Facebook’s mammary patrol.

Look, we’re the ACLU. Of course our Facebook posts are going to touch on controversial subjects—if they didn’t, we just wouldn’t be doing our jobs. We won’t ever (apologies in advance) post gratuitous nudity—flesh or metal—online. Anything we post illustrates a broader point about our civil liberties. And sure enough, this particular naked statue did just that by serving as a touchstone for a conversation about community standards and censorship. Thousands of people read the blog and hundreds commented on Facebook, weighing in on the censorship controversy. That is, before Facebook removed the post. The irony here is pretty thick.

As we read Facebook’s Community Standards, our busty statue pic was A-OK. Facebook is generally strict about human nudity, but the “Nudity and Pornography” standards also have a caveat:

Facebook has a strict policy against the sharing of pornographic content and any explicitly sexual content where a minor is involved. We also impose limitations on the display of nudity. We aspire to respect people’s right to share content of personal importance, whether those are photos of a sculpture like Michelangelo's David or family photos of a child breastfeeding.

The sculpture Holly snapped isn’t just of personal importance to her and other Kansans, it’s now of political importance too. And while art critics may or may not deem this particular bronze “a sculpture like Michelangelo’s David,” that’s precisely the analogy I used in my original blog post. The statue is at the swirling center of a community fight that implicates the First Amendment, obscenity, and even the proper use of the criminal justice system. The statue’s image belongs on Facebook, not only because it is of personal and political importance, isn’t obscene, and doesn’t violate community standards—but also because the statue is newsworthy. And Facebook should work hard to keep newsworthy content out of the censor’s crosshairs.

The Facebook Censors Are Fallible
We decided to appeal Facebook’s determination that our blog post didn’t fit within community standards, just like any user might. And… we immediately hit a brick wall. The takedown notice informed us an ACLU post had been removed, but didn’t exactly invite a conversation about it:

There was no “appeal” button, and we were unable to find a page where we could report or challenge the post’s deletion. The best option appeared to be a generic Facebook content form, designed to receive any input at all about a “Page.” We got a response: a canned email informing us that Facebook “can’t respond to individual feedback emails.” Not exactly promising.

But we have an advantage most Facebook users don’t: We’re a national non-profit with media access and a public profile. So we tracked down Facebook’s public policy manager, and emailed him about our dilemma. His team was immediately responsive, looked into it promptly, and told us that the post was “mistakenly removed” (and then “accidentally removed again”). Here’s what Facebook wrote to us:

We apologize for this error. Unfortunately, with more than a billion users and the hundreds of thousands of reports we process each week, we occasionally make a mistake. We hope that we've rectified the mistake to your satisfaction.

Facebook then restored the original post.

It’s certainly reassuring that Facebook agrees our original post shouldn’t have come under fire and was not a violation of the Community Standards. Unfortunately, the post was unavailable all weekend as we scrambled to figure out how to bring the mistaken deletion to Facebook’s attention. That’s a big hit in the fast-paced social media world.

More unfortunately, our ultimate success is cold comfort for anyone who has a harder time getting their emails returned than does the ACLU. It’s unlikely that our experience is representative of the average aggrieved Facebook user. For most, that generic form and the canned response are as good as it’s currently going to get.

My colleague Jay Stanley has highlighted the dangers of corporate censorship before here on the pages of Free Future. He argues that as the digital world steadily eclipses the soap box as our most frequent forum for speech, companies like Facebook are gaining government-like power to enforce societal norms on massive swaths of people and content. A business primer from our colleagues in California illustrates how heavy-handed censorship is as bad a choice in business as it is in government. Fortunately, Facebook is generally receptive to these arguments. With Facebook’s mission to “make the world more open and connected,” the company is clearly mindful of the importance of safeguarding free speech.

But like all censors, its decisions can seem arbitrary, and it also just makes mistakes. If Facebook is going to play censor, it’s absolutely vital that the company figure out a way to provide a transparent mechanism for handling appeals. That’s particularly true when censorship occurs, as it so frequently does, in response to objections submitted by a third party. A complaint-driven review procedure creates a very real risk that perfectly acceptable content (like…you know, images of public art) will be flagged for removal based on the vocal objections of a disgruntled minority. A meaningful appeals process is therefore overdue.

More fundamentally, this incident underscores why Facebook’s initial response to content should always err on the side of leaving it up, even when it might offend. After all, one person’s offensive bronze breast is also one of Kansas’ biggest current media stories.

That a bronze sculpture in a public park in Kansas ran afoul of the nudity police shows that Facebook’s censors could use some calibration. And when they misfire, as they did here, there must be a process in place to remove the muzzle.



The irony is that it is just as hard to get a legitimately foul page deleted or blocked; not all graphically violent content gets auto-filtered. The page that comes to mind shows deceased persons, young and old, in crime scene photos.
The following is a link; perhaps you can use your connections to have this material reviewed by a human being.


That isn't the ONLY thing that gets censored. If you go to the NRA's page and write that you're a gunshot victim, say that you have no problem with the RIGHT people having guns, but tell your story about how resistance to background checks made it perfectly legal for the guy to own all the guns, including the one he shot you with three times, they'll censor the comment by deleting it.
B/c we can't have people knowing that a gun is dangerous in the wrong hands, now CAN we?
That has nothing to do with the Second Amendment. I heard that the amendments weren't created to protect criminals, and the guy who shot me was a criminal who DIDN'T steal his gun. He bought it at a gun show undetected b/c people refuse to be responsible about background checks, which incidentally have nothing to do with violating Second Amendment rights. The only people who would FAIL a background check are already criminals.
I lost every bit of respect I had for the NRA the minute they agreed that Mr. LaPierre was a worthy representative for them. It's only been this bad since he became their mouthpiece.
Oh. And they DIDN'T censor what Stephen King said to them.
I don't care if he has more money than I do, and probably a parade of legal counsel following him. I don't think he's more important as a human being than I am just b/c he has attorneys. If he can make a comment against them, why can't I, whose entire life and happiness have been compromised b/c of the disability that gunshot created?


Welcome to the exact same battle that breastfeeding mothers and advocates have been fighting with Facebook for a number of years now. Facebook's response, when one can be elicited, is ALWAYS, "Oops, sorry for our occasional mistakes."

The "mistakes" are far from occasional. Facebook is set up to empower anonymous cyberbullying, with limited recourse for victims who can't get the right guy on the phone (like the ACLU did) or don't have big advertising bucks to throw around (like Nestle, which has actually used breastfeeding photos to advertise its infant formulas).

Censorship of obviously non-pornographic images of breasts and breastfeeding is part of what keeps breastfeeding rates low, which harms children, mothers, and public health. This too is a political issue involving reproductive justice and the status and treatment of women.


While I don't agree with their actions, I don't see how this is an ACLU matter. Can't Facebook censor whatever it wants, based on any criteria you've agreed to by signing its usage agreement?

Meanwhile, in VA, a boy was expelled from school and a few others were suspended after a neighbor complained of him playing with an airsoft gun in his own front yard with other children, in view of the bus stop 70 yards away. The decision was based on the school's zero-tolerance policy for firearms in school (yes, the event was not during school hours or on school property).

I'm thinking being punished by an institution for what you do in your own front yard is a lot more important w.r.t. civil liberties than Facebook censoring one of its guests' (bound by usage agreement) photos in Facebook's own front yard.

David Hogan - A...

Good article, but if you are going to go after Facebook for "playing" censor, how about giving equal attention to the lawyers who bring lawsuits against media outlets for "allowing" a post to exist? There are many, such as the ex-Bengals cheerleader who won a defamation lawsuit, among many others.

If you are the real champion of liberties (and I believe you are), then help social media allow free speech without the lawsuits that often come...

The process you went through was difficult, but you did get your post restored.

Anyone can bitch and force others to effect a change; how about helping that become a reality? Seriously, sit down with the ambulance chasers and judges and lawmakers and figure this out... or just bitch.


They are also bad about upholding their hate speech policy. Several hundred of us filed complaints about a "witches must die by fire" page and another similar page: several hundred Wiccans, pagans, witches, whatever you want to call us. But Facebook refused to block the page because it was Christians who were doing the hate speech. To prove a point, I and a couple of others made copy pages using the exact same words, replacing "witch" with "Christian" or "Jew," and those pages were banned in about a day and a half. But it took a few weeks, several web posts, petitions, etc., to get Facebook to follow its own policy on hate speech against people or religious groups. The employees refused to recognize witches as a religious group, and saw no problem with people threatening and saying all witches should die in a fire. They had no problem with them clearly stating that every witch should die, be killed, and burned alive! I could not believe it when they said it didn't violate their policy.

Freedom of religion is one thing, but allowing people to promote burning others is another. It took a couple of weeks, and about every Wiccan and pagan group banding together, before they took the page down. So if you ever talk to that dude again, inform him that he needs to teach his employees to enforce the rules and protect all people and religions on Facebook. That page should have been taken down just as quickly as the "all Christians should die by fire" page, which only took a day and a half to get shut down. Both pages had the exact same words, with only "witch" replaced by "Christian," copied straight from their page word for word, and the copy even stated that its purpose was to show how Facebook considered it OK to burn witches but not Christians. But yes, the system is flawed and needs better handling of complaints and of the pages that are complained about.


Excellent work as usual, thank you. Might I suggest that using the word "rack" for the statue's anatomy does your message a slight disservice.

Laura Wilson

I'm the HMUA and assistant to Ashlee Wells Jackson for the 4th Trimester Bodies Project, and we've faced very similar issues to yours, with no recourse on how to solve the problem.
Ashlee has had her personal FB account, her daughters', and her separate businesses' accounts all blocked for 30 days. The 4th Trimester Bodies Project FB page has been unpublished, and her Instagram account has been deleted, all because someone keeps flagging our photos, which don't violate community standards.
We've started a petition and a new FB page called "Bring Back the 4th Trimester Bodies Project." But what else can be done? Corporations do not get the right to censor my body if it doesn't violate their Terms of Service.


Lucky for you that you have the clout you do and the means to contact a real person at FB. Many others are not so lucky. Here's one that can't plausibly have been considered to violate any FB standards:


Well, this had to happen to the ACLU, didn't it?

I really have mixed feelings about this situation for a number of reasons. Obviously the material was not of a nature that should have been censored. However, we should never forget that even though they shouldn't, they absolutely can censor anything they wish. They are literally private property, where the owner has the absolute right to control what is posted, transported, and otherwise conveyed and displayed. Even though I am an ardent constitutional rights supporter, I am also the owner of many different web properties. Ultimately the site owner has to protect itself and its brand.

The other issue, the lack of an appeal process, is a huge problem with Facebook. In fact, it is just a small part of the greater no-support issue that plagues the site.

In the end if users get fed up with this poor corporate behavior they will leave.

