
Three Big Battlegrounds in the Coming War Over National Privacy Legislation

Neema Singh Guliani,
Former Senior Legislative Counsel,
American Civil Liberties Union
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
October 23, 2018

Washington is abuzz about the need for national privacy legislation. In the wake of the Facebook Cambridge Analytica scandal, even typically anti-regulation Republicans are calling for federal privacy legislation. The calculus has also shifted for some industry representatives, who have concluded that a single federal law may actually be better for companies than a patchwork of state laws like the one California has already passed. There is still a major battle in the works, however, over what such legislation looks like and how meaningful it is.

Numerous industry groups have issued “statements of principles” in the hopes of shaping the conversation over what the privacy law should look like. Many of these statements are lacking in key areas, however — and even with their positive elements, the devil will be in the details on whether they meaningfully empower individuals or end up being no more than broad, bland, and loophole-filled platitudes.

Overall, there is growing consensus that a bill must, at a minimum:

  • Require transparency so that consumers know how their information is being collected, used, shared, and stored
  • Mandate measures allowing consumers flexibility to move data between services
  • Require notice and consent to use and share personal information (though the definition of personal information and what consent requires may be a point of contention)
  • Notify consumers and regulators when there has been a data breach
  • Ensure that companies adopt reasonable cybersecurity measures to protect against hacking and other breaches

Despite this common ground, however, there are other fundamental issues where consumer privacy groups and some industry players will likely clash. Those battles will include these three issues:

Preemption of state protections

As Sen. Brian Schatz (D-Hawaii) aptly noted in a recent hearing, we would likely not even be having a debate about a federal privacy law but for a data privacy law recently passed in California. Despite successful industry efforts to water the California bill down before passage, it does give consumers rights to know what information companies collect about them, and consumers can opt out of companies selling their information. Industry is opposed to some of the law’s protections, and it’s afraid other states will follow California’s example or pass even stronger laws.

More state privacy laws could be good for consumers, in part because companies often find it easiest to apply the strongest law everywhere. After all, it's not easy to explain to a consumer in South Dakota why they should have fewer privacy rights than a Californian.

That’s why some representatives of the tech industry are pushing for a federal law that preempts state law, effectively gutting states’ ability to pass laws to protect consumers. A preemption proposal could sweep broadly, foreclosing states from passing new consumer protections, limiting enforcement by state agencies and attorneys general, and invalidating a host of existing protections for sensitive information like Social Security numbers, student data, and more.

This would be a major change. Existing federal consumer protection law explicitly allows states to protect consumers and take action to prevent fraud, even though the Federal Trade Commission is also tasked with doing so. Other laws, including the Telecom Act, allow states to put in place additional protections for consumers, provided those protections do not interfere with federal legislation.

In these cases and others, federal law sets a floor — not a ceiling — for consumer rights. Particularly given the rapid pace of technological innovation, we should be wary of a federal law that locks in place limited nationwide standards that will soon be obsolete and blocks any innovation by the states, which are often more adept at responding to new challenges.

Enforcement

Regulations mean little without robust enforcement, which some companies view with trepidation. As a result, much of the coming debate will likely revolve around how privacy standards will be enforced.

Europe has set a powerful example of what enforcement powers can look like. Under its recently enacted General Data Protection Regulation, violating companies can be fined up to 4 percent of their global annual revenue, a powerful incentive for companies to take the regulations seriously. In contrast, the Federal Trade Commission’s fines are often minuscule compared to the profits of large companies. In addition, the commission is poorly resourced to effectively police industry. The FTC employs only about 1,100 people, with fewer than 100 attorneys focused specifically on privacy and enforcement. To put this in perspective, Facebook alone employs over 30,000 people, and Alphabet, Google’s parent company, has over 89,000 employees.

These factors may be part of the reason that the FTC has not always been the most effective privacy watchdog. Take the breach involving the data broker Equifax, for example. The information of over 140 million consumers was exposed due to what some members of Congress referred to as “malfeasance” on the part of the company. One year later, the company is on track to post record profits, and consumers have not been compensated for the cost of credit freezes the breach made necessary.

State governments and private citizens have often been critical in filling the federal government’s enforcement void. For example, the Massachusetts attorney general is currently suing Equifax for damages, and private citizens have filed numerous lawsuits seeking to recover damages from the company.

The Equifax breach highlights some of the elements that are necessary to ensure strong enforcement. First, the FTC needs more resources, the ability to promptly levy meaningful penalties for privacy violations, and expanded rulemaking authority. Additionally, state attorneys general and agencies must be permitted to investigate and enforce violations of any new federal rules, as well as continue other ongoing enforcement activities. Finally, consumers must have the ability to take companies to court when the rules are violated, an idea that is strongly opposed by many in industry.

What enforcement measures companies will get behind — if any — could be a contentious issue.

Limits on use and retention of information

One of the central problems with the current privacy rules is that they rely on a regime of “notice and consent.” Under the current system, as long as a company informs you (often very vaguely) what it is doing somewhere in a 16-page fine-print click-through agreement, and you “agree” by clicking through that agreement, then the company has covered its bases. Based on the literally impossible legal fiction that consumers read and understand every such agreement, our current, deeply problematic ecosystem of widespread privacy invasions has been allowed to fester.

The reality is that many consumers can’t possibly understand how their data is being used and abused, and they don’t have meaningful control when forced to choose between agreeing to turn over their data or not using a particular service.

Europe’s solution to this quandary has been to put in place regulations that create a presumption that consent cannot be freely given if use of a service is premised on a consumer handing over data that is not necessary for that service. However, many in industry have opposed this measure, in part because imagining new uses for consumer data is how many companies plan to turn a profit.

Any privacy legislation must tackle this problem in a meaningful way. Legislation must limit the purposes for which consumer data can be used, require purging of data within certain timeframes, and prevent coercive conditioning of services on waiving privacy rights.

Otherwise, we risk ending up in the same place we began — with consumers simply checking boxes to consent with no real understanding of or control over how their data will be used.
