Mark Zuckerberg was in the hot seat this week, facing questions from members of Congress regarding the numerous Facebook privacy breaches that have been revealed over the last month. In the wake of these breaches, many are asking: Do Facebook and similar companies need to be regulated?
In considering this question, the U.S. can learn from the approach being taken by the European Union in its new data protection law, called the General Data Protection Regulation, or GDPR. Congress should look to this model and similarly enact comprehensive privacy legislation. Regardless of whether the latest debacles result in better policies at Facebook, consumers should not be reliant on the goodwill of companies for their privacy. Indeed, CEOs have made promises before only to renege on them as public outrage fades or profit temptations grow.
The GDPR is an example of something many other countries have also adopted: a baseline comprehensive consumer privacy law, which puts into place some broad rules for the fair treatment of people’s information and creates shared expectations and understandings for consumers, businesses, and government alike. It also creates standards that will likely end up extending, at least in part, to internet users in the U.S. As a result, it will likely have a big effect on Americans’ privacy, and not only with regard to Facebook.
The EU has had an overarching privacy law since 1995, but the new law, which goes into effect on May 25, is significantly stronger — despite the lobbying efforts of Facebook and other American companies to weaken the GDPR’s protections.
While the new law only applies in the EU, it will help protect Americans’ privacy in several ways. First, the EU law raises international standards for privacy, and will influence policymakers in India, across Asia, and in other countries around the globe, creating a broad culture of compliance with privacy rules among the executives of global companies. That will inevitably spill over into how companies do business with Americans.
More immediately, however, companies like Facebook that do business in both Europe and the United States will need to either do business in different ways in each location, or unify their practices to comply with the GDPR. Inevitably, in some cases they will choose the latter.
For example, one provision of the GDPR requires certain companies that control how and when data is processed to give users a copy of the data held about them within a set period of time in a commonly used electronic format. But many companies, especially large, global ones, hold data scattered across numerous departments, divisions, and databases. Such companies are currently working furiously to map and classify the data they hold so that they can comply with data requests and other obligations. That will affect the very structure of companies’ data operations.
Such considerations don’t just affect big international companies like Facebook. A lot of companies will be affected not because they do business in the EU but because they do business with a company that does business in the EU, explained Kurt Wimmer, a lawyer who helps companies think about compliance with the GDPR, at a recent conference. Wimmer also noted that since a lot of startups want to be bought by a larger company, they are starting to think about GDPR compliance as they design their company structures and practices. In fact, any company that wants to be a more attractive acquisition target, or that might want to enter the EU market at some point in the future, can benefit from structuring its operations the pro-privacy way, because restructuring data systems is not something that can be done overnight. As an example of the reach of the EU law, Wimmer said that The Washington Post, a seemingly entirely domestic American company, is planning to comply with the GDPR.
One of the criticisms of the EU’s old data privacy law is that enforcement and compliance were very lax. But the GDPR comes with a very heavy stick: Violations of the law can result in fines of up to 4 percent of a corporation’s global annual revenue — a giant sum in the case of a large multinational. As Wimmer put it, “the 4 percent fine is huge. That really has everybody’s attention.” Before the GDPR, he said, “data protection law was more of an IT or law department thing; now it’s a CEO-level concern, and that’s when companies really get serious about complying.”
One of the most important effects of the GDPR is that it could create pressure on companies like Facebook to provide U.S. consumers the same ability to control their data that the law gives to Europeans. In addition to the right to access one’s data, other important provisions of the GDPR that increase consumer control include:
- Consent requirements: The GDPR requires (absent other specified circumstances) that companies get users’ consent to collect, use, or otherwise process their personal data. The law tries to prevent this requirement from being rendered meaningless by unreadable and unread fine-print click-through agreements. It does that by requiring that this consent be specific, informed, freely given, and granted through an affirmative action or statement by the user. The company has to ask for consent in a manner that is intelligible, easily accessible, and uses clear and plain language. The user is also given the right to withdraw consent at any time.
- Take it or leave it: The regulation presumes that consent is not freely given if a company makes a “take it or leave it” offer, which says, “you can’t use our service if you don’t consent to data collection that’s not necessary for the service.”
- Data portability: The GDPR gives users the right to receive a copy of their data in a “structured, commonly used and machine-readable format” and to have this data transferred to another provider. The intent is to let consumers leave a platform without losing their data.
- Transparency: The GDPR requires companies collecting data to be transparent about their data processes. That means, for example, that users have a right to know how long their personal data will be stored, what categories of personal data are collected, whether they are subject to automated decision making, who else receives their personal data, and the purpose for which their personal data is being collected, used, or otherwise processed. Companies also have to be transparent about personal data they obtain from other sources.
- Limits on marketing uses of data: GDPR provides users the right to object to the use of their data for marketing purposes.
- Automated decision-making: With certain exceptions (such as explicit consent), the GDPR says that people have the right not to be subject to decisions based solely on automated processing if those decisions have a legal or similarly significant effect. Given the rise of predictive algorithms in more and more areas of our lives, from insurance to policing to social services, this is a very significant protection.
The GDPR is not perfect. For example, it contains a “right to be forgotten,” which in certain situations requires companies — including in some cases newspapers and search engines — to erase information about an individual upon that individual’s request. We might not have a problem with a narrowly tailored rule that, for example, forces a company like Facebook to erase a user’s own account and data upon request. But the European version (in a region where free speech is generally less protected than here) is far too broad, and if applied in the United States it would likely violate the First Amendment by sometimes mandating what amounts to censorship of certain information about individuals.
The GDPR also creates distinct protections for data that can reveal certain types of sensitive information, which may not reflect the full range of information that individuals find revealing. And as with any regulation, we don’t know how it will need to evolve as it is applied and enforced, and how effective it will ultimately be. Will it do enough to ensure real choice and transparency, and to curb the invasions of privacy that permeate so much of today’s online ecosystem? Will it do enough to ensure that consumers have meaningful choices in the market? Time will tell.
But the EU has at least begun to tackle the problem of protecting privacy in the information age. And if the EU can undertake such an effort, there is no reason that the U.S. should continue to lag behind with laws that are ill-equipped to ensure that individuals can meaningfully control their own data.
Mark Zuckerberg, in his congressional testimony, seemed to indicate that Facebook would offer “the same protections” to American users as it will to EU users under the GDPR, but equivocated when asked whether Americans would have “all the rights” conferred by the GDPR. We’re not sure what he meant by that distinction, but his equivocation is worrying, and Congress should press Facebook to make clear that the company will voluntarily apply all GDPR protections globally.
However, other multinationals that collect a lot of data about Americans may not extend GDPR rights to Americans. And regardless of any voluntary action on the part of companies, without U.S. legislation, American users won’t have any legal standing to enforce the rights that EU citizens gain under the GDPR.
We shouldn’t need to rely on the promises of individual companies to protect our privacy. The GDPR will likely bring definite improvements for Americans, but our elected officials need to act as well, by creating a set of baseline privacy protections of our own.