Ruling Is a Warning to Companies Collecting Biometric Scans Without Permission

Facial Recognition Test
Nathan Freed Wessler,
Deputy Director, ACLU Speech, Privacy, and Technology Project
February 8, 2019

The Illinois Supreme Court issued an important decision in late January rejecting attempts to gut the state’s landmark law that bars companies from collecting people’s biometric identifiers — including face recognition scans, fingerprints, and iris scans — without providing a written explanation of what they plan to do with the data and obtaining consent.

The law, called the Biometric Information Privacy Act (BIPA), has been on the books for over a decade. It’s the strongest such law in the nation, and it has provided a robust tool for protecting some of Illinoisans’ most sensitive data against covert collection, use, and resale.

The question at issue in the case was who is allowed to sue for violations of their rights under the law. The lawsuit was brought by the family of a teenager whose thumbprint was scanned when he visited a Six Flags amusement park. Contrary to the law's requirements, he was given no explanation of why he was fingerprinted or how the data would be used.

BIPA allows anyone “aggrieved” by a violation of its provisions to seek monetary damages and other relief. The defendant in the case, supported by a number of organizations representing businesses that seek to collect biometric data — including tech giants like Facebook, Google, and Amazon — argued that someone can only be “aggrieved” if they can prove that they have suffered actual damages, such as monetary loss or other concrete harms.

As we explained in a friend-of-the-court brief, however, that interpretation would often leave “no means to hold wrongdoers accountable for their violations of BIPA’s notice and consent requirements” because “privacy harms are difficult for the consumer to understand at the outset and discover after the fact.” (In addition to the ACLU and the ACLU of Illinois, the brief was joined by the Center for Democracy & Technology, the Chicago Alliance Against Sexual Exploitation, the Electronic Frontier Foundation, the Illinois PIRG Education Fund, and Lucy Parsons Labs.)

In a unanimous opinion, the Illinois Supreme Court agreed, holding that “an individual need not allege some actual injury or adverse effect, beyond violation of his or her rights under the Act, in order to qualify as an ‘aggrieved’ person and be entitled” to sue. The court explained, quoting a lower court’s ruling in another case:

The Act vests in individuals and customers the right to control their biometric information by requiring notice before collection and giving them the power to say no by withholding consent. These procedural protections “are particularly crucial in our digital world because technology now permits the wholesale collection and storage of an individual’s unique biometric identifiers — identifiers that cannot be changed if compromised or misused.” When a private entity fails to adhere to the statutory procedures, as defendants are alleged to have done here, “the right of the individual to maintain [his or] her biometric privacy vanishes into thin air. The precise harm the Illinois legislature sought to prevent is then realized.” This is no mere “technicality.” The injury is real and significant.

The court’s ruling ensures that the Illinois law remains a meaningful tool for protecting against invasions of privacy. It will have an immediate effect in other cases, including a lawsuit challenging Facebook’s collection of face recognition scans — also filed under BIPA — in which we filed a friend-of-the-court brief late last year.

The decision also stands for a larger principle: in an age when companies have ever greater abilities to amass and monetize our personal data, it is crucial that Congress and state legislatures provide strong laws that both protect people’s rights and allow them to sue when companies violate the law. Legislators should reject self-serving industry arguments — similar to the ones made in this case — that consumers don’t deserve the right to take companies to court unless they can prove monetary or other concrete harm.

As my colleague Neema Singh Guliani recently explained in The New York Times, “Huge privacy violations have become commonplace. Without a private right of action, consumers have little practical ability to seek relief in cases where their data was mishandled or misused.”

Lawmakers nationwide would be wise to follow Illinois’ lead and ensure that people throughout the country have a way to defend against surreptitious or misleading uses of their biometrics and other private and sensitive data.