The Threat of Facial Recognition (ep. 39)
Nicole Ozer, the Technology and Civil Liberties director for the ACLU of California, has been at the forefront of debates around privacy and technology for more than 15 years. She joins At Liberty to break down the current state of facial recognition technology and why it raises civil rights and civil liberties concerns.
[0:04] From the ACLU, this is At Liberty. I'm Emerson Sykes, a staff attorney here at the ACLU and your host. Recent headlines have been full of references to facial recognition technology, which promises unprecedented convenience as well as new and difficult privacy concerns. Businesses, most prominently Amazon, want to use facial recognition to create a seamless consumer experience while collecting valuable information for their platforms and advertisers. And in the hands of government, whether it be local law enforcement or ICE, facial recognition technology opens the door to surveillance on a scale not previously possible.
Our guest today is my colleague Nicole Ozer, the Technology and Civil Liberties Director for the ACLU of California. Nicky has been at the forefront of debates around privacy and technology for more than 15 years. She's here to help us understand facial recognition technology and why it raises important civil rights and civil liberties concerns. Thank you very much for joining us, Nicky. Welcome to the podcast.
Thanks so much. So nice to join you.
Just this week, I received an email from my undergraduate alma mater proudly announcing that the campus restaurant was now featuring kiosks where a small machine would take an image -- a face print, they called it -- and it would be linked to your order and to your payment method.
Why should this concern us?
Well there are so many questions there.
One is, is it even necessary to be using that type of system? The second is, what's going to happen to all that really personal information once it's being used? How is it going to be protected? Is there a third party that's actually collecting that information and potentially using it in other ways?
[1:59] It wouldn't surprise me if the campus really hasn't asked any of those hard questions and isn't really thinking about the fact that, you know, nobody can change their face. So if that information gets breached or ends up being used in other ways, there's very little recourse for any of those students or faculty or others that might be visiting the campus.
Well, two things that the co-founder of the company said really stuck out to me about this example. One was that they thought that using a face print -- as opposed to, say, a credit card or some other mobile payment method -- was actually going to increase security, because it would be harder to commit fraud. The other was that they were targeting college students because they thought they were early adopters of this type of technology. What do you make of this claim that, as you said, you can't change your face, so a face print is actually more secure than something like a credit card that could be lost or stolen?
Well I think, you know, it really cuts both ways. You can't change your face, so the fact that this company is collecting face prints on all of these young people who otherwise wouldn't be in a database means that now there's a huge collection of face prints about these young people that will persist throughout their lives. So if you have somebody who's a campus activist, now they're in a database that could potentially be used to track them around campus, and potentially in the rest of the world.
What do you make of the fact that young people seem to be more willing to make these sorts of privacy compromises? Friends in Silicon Valley have talked about being ahead of the privacy curve, the idea being that things that seem creepy now may eventually seem normal. Is this really just a generational thing, where young folks are much more comfortable giving up their privacy, whereas people who are not natives to the digital era balk at these types of compromises?
[3:56] I've been doing this work for a long time, and young people care as much about control over their information as older folks do. There's a lot of academic research showing that all people overestimate immediate benefit -- sort of the convenience argument -- and underestimate long-term risk. That's just behavioral economics. And the fact is that we as people have to be able to underestimate long-term risk, or else we really wouldn't do anything in our lives. Most of us probably wouldn't end up having children or driving if we really thought about all the risks.
But young people in particular have a little less context for that, just in terms of where they are in their lives. You know, unless the company has told people that that data is never going to be shared or used in other ways, once it's collected you really don't know how it might end up being used.
And these are incredibly powerful systems. The Amazon system is powered by artificial intelligence. It can search against tens of millions of faces in real time, and it can try to identify up to 100 people in a single photo. Amazon even recommends using Rekognition with police body cameras, which would really turn a tool that's supposed to be for government accountability into a tool for government spying. So we've definitely seen a lot of student activists who feel very concerned about what's happening.
That's an interesting point you brought up around targeting of activists or protesters, and I definitely want to come back to that. But just as an initial point, you talked about your work around the Amazon facial recognition technology, which is powered by artificial intelligence. People may be familiar with the test the ACLU of California published, in which Amazon Rekognition falsely matched 28 members of Congress with mugshots of people who had been suspected of crimes. And members of Congress who are people of color were disproportionately misidentified by Rekognition.
[6:04] So is the problem that the technology will be too good and the companies will be able to have access to vast amounts of data? Or is it that they're too inaccurate and prone to bias and mistakes?
I mean, the problems are both. Right now, we know that face recognition systems are both biased and inaccurate. There have been studies over the years consistently showing that face recognition is less accurate at identifying people of color, particularly women of color. But accurate face surveillance doesn't mitigate any of the civil rights concerns. What it means is that there is a system that supercharges surveillance and that is really a ready-made tool for the government to track and monitor communities, including activists, immigrant communities, and other communities of color.
Well it's interesting to hear you talking about the combination of these technologies that are being developed by private companies, but then also the potential use by government. And this exchange of technology between the private sector and government has gotten a lot more attention recently. And I'm just wondering -- so you talked about the surveillance of protesters as sort of the ultimate risk. Is that really the biggest bottom line worry? What about for folks who aren't necessarily as politically active? Are there risks for them as well?
I mean, there's risks for everybody.
It really gives the government unprecedented power to track and surveil who we are, where we go, and who we know, across place and across time. And we can choose not to drive our car or bring our cell phone out on the street, but again, we can't leave our face at home. Here in the United States, the government isn't supposed to intrude on our private lives unless it has a warrant and probable cause. We're supposed to be able to live our lives free from surveillance, without the government reaching in and intruding on that. So, you know, I think this has definitely struck a chord.
[8:14] And, you know, face surveillance isn't in widespread use right now. But we have really been working with many other partners to make sure that this infrastructure isn't developed. You know, the government had been quietly building an infrastructure over many years that would enable them to really flip the switch and be able to have this system in place. Right now already, more than 115 million people in the United States are in a matching database. That's because 16 states already share their driver's license photos with the FBI for this purpose. And since 9/11, over a billion dollars a year has been sent down to local communities by the Department of Homeland Security and the Department of Justice to build surveillance infrastructure in the form of video cameras and license plate readers, drones and also police body cameras which were purportedly for police accountability.
All of that surveillance infrastructure that exists right now could be quite easily equipped with face surveillance capability. We know that the government would like to be able to flip the switch and use the infrastructure that they've already developed to do things like a virtual border wall and interior enforcement and local law enforcement. And once it's turned against the public, the harm really can't be undone.
Well it's noteworthy that you're talking a lot about the impact when government gets control of these things, but a lot of what you have done in your office at the ACLU of California has really targeted the tech companies themselves. Doing your work from San Francisco brings a bit of a different perspective than if you were working in D.C., for example.
[10:01] I noted that you published a regular report on privacy and free speech, calling it, “It's Good for Business.” So can you talk a little bit about this strategy of engaging directly with tech companies that are your neighbors in Northern California?
Yes. So, we look at an issue and we think about what are all the different strategies we can employ, like litigation, legislation — and corporate advocacy is also a really important strategy there. The reality is that many of these companies have direct impact on millions of people's lives, not just here in California but across the nation and also across the world.
So for example, about two years ago, we discovered that there were law enforcement agencies across the country that were engaging in social media surveillance. And it turned out that one of the leading companies actually had data deals with Facebook and Instagram and Twitter to be able to get access to information that enabled them to really facilitate the social media surveillance.
And these systems were being marketed particularly to target activists and activists of color. Anyone who tweeted about Black Lives Matter was being characterized as an overt threat. We -- in partnership with the Center for Media Justice and Color of Change -- actually got Facebook and Twitter and Instagram to change their worldwide policies and prohibit use of data for social media surveillance.
Our coalition, our ACLU coalition on face surveillance, includes over a hundred other organizations. And we have been pushing the companies to stop providing face surveillance to the government. There's also a shareholder proposal that was recently filed against Amazon.
[11:56] None of the companies have committed to what we have been pushing for yet, but Google made a statement that it is not releasing a face surveillance product unless and until some of the major civil rights issues can be addressed. Microsoft has been calling for changes, both internal and potentially some legislation in this area. And Amazon has started to change its tune, realizing that it has to be responsive to concerns from the public, from lawmakers, and from its own employees, who don't want to work on products that are harming their own communities. More than 500 Amazon employees wrote a letter to Jeff Bezos pushing the company to stop providing dangerous face surveillance to the government.
Well those are some interesting examples of different kinds of outreach and advocacy that you and your colleagues have been doing, and I want to come back to litigation as another tool in your toolbox. But where do you think the change in policy is most likely to come from? Is it from customers? From their own employees? From management just realizing, sort of seeing the light? Government? All of the above? Where do you see the levers of power in terms of trying to craft a future where facial recognition is not such a risk?
I think it really is a combination of all of the strategies together.
It requires sustained pressure from a lot of different avenues to really create change. I think that's always been the case. You don't pass a law and all of a sudden change happens. That law has to be implemented. There has to be enforcement.
The reality is in the technology space, because the companies are so large and they have so much power right now, engaging in the corporate advocacy is a really important piece as well. And especially in the digital world, the lines between government and companies are particularly thin because you have the companies potentially producing the surveillance technology, but you also have the companies collecting a lot of personal information.
[14:12] The government often tries to reach into those company databases to get that information. That was really the crux of the Carpenter case that went to the Supreme Court: Does the government need a warrant to be able to get access to location information? The answer from the Supreme Court was yes.
That was what we worked on here in California when we passed CalECPA, which requires a warrant for any California government entity to get any electronic information from a company. So here in California, you need to have a probable cause warrant to get location information, e-mails, metadata -- any kind of electronic information requires a warrant.
Our work in California is very much informed by the fact that California has a constitutional right to privacy that was passed in the 1970s by the voters specifically to address the modern threat to privacy brought on by the digital revolution. The constitutional right applies to both the government and to private parties. It's far broader than the Fourth Amendment. So defending and promoting constitutional rights for the ACLU of California means promoting and defending our rights in terms of what companies do as well as what the government does.
So that's sort of our legal landscape in California, but also of course the fact that many of these major companies are based out here. So in terms of proximity, we're able to try to engage with them quite extensively.
Can you provide an example of litigation on this particular issue of facial recognition? I know that you said it's not yet widely used, but I'm wondering if any litigation has emerged yet on facial recognition specifically.
[16:05] There's a case in Florida where a criminal defendant is trying to challenge a face recognition misidentification, and it's very difficult. The court has not been willing to let this person challenge how the computer identified him. That's one interesting case right now, because we know that face recognition systems are quite inaccurate. And if you have police starting to use these systems to identify people and then arrest them, those criminal defendants really have to have a way to challenge that identification and be able to say that it is not accurate. So that's a really interesting case that the national ACLU and the ACLU of Florida are working on right now.
But generally, you know, we’re trying to make sure that the infrastructure and the uses do not happen because litigation is always after the fact. And so it means that somebody has already been harmed, somebody has already been arrested, somebody's already been surveilled. And we want to make sure that those systems aren't put in place and there isn't infrastructure that's going to be used to target communities and engage in discriminatory policing. It's always optimal to be able to prevent people’s rights from being violated rather than have to wait till after the fact and fix something that really isn't fixable.
Well you’ve painted a really compelling picture of the landscape around facial recognition at the moment.
Maybe we can finish with a couple of questions around solutions and where we're looking next. One of the things that is always frustrating is when you have privacy experts like yourself and others sounding the alarm, telling us all of the implications of these new technologies that we may or may not be aware of.
[17:56] Some of the most tech-savvy people I know don't have smartphones, for example. But the general consumer public seems to be either complacent or sort of disempowered by the idea of doing without Amazon or Facebook or Twitter. So how do you bridge that gap between the privacy experts, who have a real picture of what's going on, and what we might describe as a generally quiet and acquiescent consumer base? Is that an accurate description of the dichotomy?
I think that technology is really no different from a lot of other areas where there's a lot of information asymmetry. The general public doesn't necessarily understand exactly how certain pharmaceuticals are produced, but they don't have to, because the FDA exists to make sure that drugs are produced in a safe way. And tech is in a situation where an entire industry has developed over the past 15 or so years without really being given many rules.
The Web just turned 30 this month. And the creator of the Web said it's now time for the Web to get out of its adolescence. I think we're at a real turning point in terms of making sure that the technology industry doesn't continue on without any rules that make sure people are being protected.
Well it seems that you and your colleagues at the ACLU of California have generally had the orientation of trying to slow things down a bit, make sure everyone understands what's at stake, both from the company side and from the consumer side, as well as the implications for government.
I noticed that you've called for a few moratoria -- a moratorium on using facial recognition for law enforcement, a moratorium on using it in federal cases. And I'm wondering are we too far gone for a moratorium? Is the cat out of the bag on facial recognition?
How is it going trying to pump the brakes on this breakneck development?
[20:00] You know, I think that we are at a perfect time.
The infrastructure has sort of been quietly developed, but it isn't being widely used right now. That's why the ACLU has called for a federal moratorium on its use for immigration and law enforcement.
The ACLU of Massachusetts has introduced a moratorium in that state. The ACLU of Washington has introduced a moratorium in Washington state. There's legislation that's been introduced here in San Francisco under which no city or county entity would use face surveillance. And we expect many other state bills and potentially other legislation down the line as well.
And academics across the country are also calling for the companies not to provide face surveillance to the government, and for there to be pauses, at a minimum, on any kind of deployment. Many are also calling for it not to be used at all by the government.
Well if you had to look forward over the next year, two years, five years, where do you see this issue going?
There's a lot of flux right now. You know, we don't know who's going to be in power in the next couple of years, or what kinds of decisions are going to be made.
I think that as these systems get even more complex, and it gets harder to peek under the hood of the artificial intelligence, it's all the more important to make sure that there are real laws in place. Just because something is technologically possible doesn't mean it should happen. And I think there's no more important issue in this space than making sure that face surveillance isn't in the hands of the government.
Thank you very much, Nicky. I really appreciate you taking the time to speak with us today.
Thanks so much for having me.
[21:56] Thanks very much for listening. We hope you enjoyed this conversation. If you like what you hear, be sure to subscribe to At Liberty wherever you get your podcasts and rate and review the show. We appreciate the feedback. Till next week, peace.