The Internet has evolved into a true marketplace for every idea – if you can think of it, you can find it on the web. That the online world has blossomed into this virtual town square teeming with diverse content is no accident. It is largely a creation of federal law – specifically, Section 230 of the Communications Decency Act of 1996. Section 230 is directly responsible for the free, messy, uncensored, and often brilliant culture of online speech. By prohibiting most state civil or criminal liability for something somebody else writes or posts, it created the single most important legal protection that exists for websites, bloggers, and other internet users. Under Section 230, a website can provide a platform for all speech without worrying that if one of its online users posts something stupid, critical, defamatory, or unlawful, the website itself can be held responsible.
What does this mean for the web as we know it? Almost everything. It means that Yelp can't be held legally responsible for a negative restaurant review written by one of its users. It means Craigslist doesn't have to screen every personal ad to make sure it isn't a cleverly disguised prostitution pitch. It means that Reddit could, and did, offer its users a thread tracking the manhunt for the Boston Marathon bombers in real time without censoring users' reports from the BPD scanner. In short, Section 230 makes sure that any website that offers individuals a place to speak — comment threads, group forums, consumer reviews, political meet-ups, you name it — doesn't have to police its users to make sure every post is within the letter of state and federal law.
But last week a group of state attorneys general wrote a letter to Congress asking to change all that — and their misguided proposal threatens to undermine the legal regime that has allowed speech to flourish. The AGs have asked Congress to amend Section 230 so that websites can be held liable under accessory, accomplice, facilitation, or similar legal theories for users' violations of state criminal laws. If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws. As Matt Zimmerman over at EFF points out, these include such infamous crimes as posting Netflix passwords online and publishing someone else's defamatory speech (which is illegal under a number of state laws). No website owner in her right mind would offer an uncensored user forum knowing that the website could be investigated, shut down, or charged with a felony just for one user's speech. We've joined a letter in response to the proposal that was submitted to Congress yesterday.
History shows us that the likely result – the dramatic chilling of online speech – isn't a theoretical slippery slope. Section 230 was actually passed in response to the dark early days of the Internet, when websites faced lawsuits over speech by their users. Section 230 wasn't passed in order to provide a legal haven for sites hosting illegal behavior, but rather in response to legal claims that sites that remove offensive or illegal user-generated content then become legally responsible for that content. The legislative history of Section 230 refers specifically to a New York state case — Stratton Oakmont, Inc. v. Prodigy — in which a Long Island brokerage firm successfully sued a bulletin board for hosting anonymous defamatory comments because it had exercised "editorial" control to remove "offensive" language. The case had the perverse effect of discouraging sites from regulating offensive (or illegal) content, and led directly to Section 230.
The AGs' proposal would turn the Internet as we know it upside-down. Without Section 230's safe harbor to ensure that websites aren't legally on the hook for content created by their users, websites would be responsible for policing every user-submitted word for possible criminal violations — which simply isn't feasible. Avoiding legal risk would require even the smallest blog to hire an army of lawyers to compare user content against the mosaic of all 50 states' ever-changing criminal laws. More realistically, websites would do one of two things: censor user-generated speech to match the most restrictive state law, or simply stop hosting user-generated content altogether. Either course is exactly what their lawyers would advise. If Section 230 is stripped of its protections, it wouldn't take long for the vibrant culture of free speech to disappear from the web. That would be nothing short of a national tragedy.
Section 230's safe harbor provisions have been positive for free speech, resulting in the spectacular diversity of content we now expect online. More than 220 years after the adoption of the First Amendment, the web has fulfilled our foundational promise of an uncensored marketplace of ideas where free speech truly flourishes. Keeping Section 230 intact will ensure that websites aren't punished for providing the soapbox.