Episode 3: The Eyes On Your Face

May 06, 2022

In Episode 3, hosts Walt Shaub and Virginia Heffernan grapple with government surveillance, focusing in particular on facial recognition technology and the ways that the government could — and already does — abuse this pervasive technology. Facial recognition expert Jake Laperruque describes the arms race between technology companies developing new software to be used by law enforcement on one side, and lawmakers and privacy advocates working to regulate this fast-developing technology on the other. Framing privacy as a human rights issue, Laperruque reminds us of the abuses of the post-9/11 era, when the government carried out indiscriminate surveillance of Muslim communities.

In our next interview, we step back and look more broadly at how targeted surveillance of all types can harm vulnerable populations, a problem that technology like facial recognition software exacerbates. Law professor and former prosecutor Paul Butler explains how the Supreme Court has unleashed the forces of control on minority populations. Finally, Maya Wang, the senior China researcher for Human Rights Watch, paints a picture of the future we must avoid. As always, the show concludes with ideas about the actions listeners can take to help address these wrongs. This episode features some good news, as the hosts break down a recent victory by privacy advocates that forced the IRS to back off a plan that would have forced some Americans to use problematic facial ID tech to access their tax records.

The Continuous Action is sponsored by The Project On Government Oversight.

Stay tuned for the latest from POGO, and don't forget to subscribe to The Continuous Action on Apple, Spotify, Stitcher, or Acast.

Walt Shaub: This podcast is sponsored by the Project On Government Oversight, a nonpartisan independent government watchdog.

Virginia Heffernan: Hello, and welcome to The Continuous Action. I’m Virginia Heffernan. And this is episode three. Today we are talking about government surveillance — and, in particular, facial recognition.

Walt Shaub: And I’m Walt Shaub. Virginia and I are going to explore the issue from a domestic perspective to see what’s happening here in the United States, then we’re going to look at the dystopian world of government surveillance in China to get a sense of what can happen if there are no limits.

Virginia Heffernan: I mean, based on our conversation with one of our guests, it sounds like the only limit on surveillance in China is the limit on technological capacity. If they can do it, they do it. And technology is constantly evolving. So those limits are going to kind of disappear with time. We are going to hear from Maya Wang on that. She’s a China researcher with Human Rights Watch who has a deep knowledge of China’s surveillance apparatus.

Walt Shaub: Before we talk to her, we’ll hear from a law professor I admire greatly, Paul Butler, and we’ll hear from POGO’s own Jake Laperruque, who’s a government surveillance expert. The evolving capacity of technology is a theme we’re going to hear about from Jake. Jake’s a senior policy counsel with The Constitution Project, which is part of POGO. And he’s spent years studying this issue of government surveillance. He recently testified before the Maryland and Massachusetts state legislatures to try to get them to impose legal limits on use of technological surveillance.

Virginia Heffernan: It’s basically a race between technology and the law, or maybe a danse macabre. I know you like it when I use French phrases, Walt.

Walt Shaub: Absolutely. Yeah, you’ll hear Jake say that the authorities seem to believe it’s better to say you’re sorry than ask for permission. They assume they’re allowed to use a technology until someone stops them.

Virginia Heffernan: And law seems to evolve slower than technology does. So, you have to get a majority in both houses of a state legislature or Congress to pass a law. That means educating and winning over lawmakers who are mostly not technology experts.

Walt Shaub: What’s interesting though, is that this seems to be an area where there have been some promising developments. You’d think the big companies have all the advantages. They’ve got the lobbyists, they’ve got the contacts, they’ve got the money, but privacy advocates are at least holding their own so far.

Virginia Heffernan: Yeah, that’s the message I got from Jake too. At the same time, I did hear him caution that it’s going to be an ongoing struggle. It’s going to take a concerted effort, continuous action if you will, from civil society to keep up with technology and keep lawmakers focused on protecting us.

Walt Shaub: I think that’s right. And that’s why we have to stay mindful of what can happen if there aren’t safeguards. I also think it’s really crucial to keep in mind that there’s an equity issue here.

Virginia Heffernan: Yeah. I mean, government surveillance clearly affects minority populations more than anyone else. And it’s long been an issue for this country.

Walt Shaub: And it matters. Paul Butler talks about how surveillance targeted disproportionately at minority populations has consequences.

Virginia Heffernan: You know, clearly, if you watch pretty much anyone long enough with this kind of apparatus of government surveillance, you’re going to find some minor infraction.

Walt Shaub: Yeah, that’s exactly right. I mean, maybe they did a rolling stop at a stop sign or failed to signal a lane change. Bingo! You’ve got an excuse to stop them. Or an officer sees someone walking down the street, and they find some minor excuse to stop them. Butler has said this law enforcement intervention “provokes a reaction.” And that’s exactly what happened in the case of Freddie Gray in Baltimore, a perfectly healthy young man who wound up dying in the back of a police van when he’d done absolutely nothing wrong. They stopped him, he panicked, he ran, and then they killed him.

Virginia Heffernan: People like to think if they’re not doing anything wrong, there’s no reason to care about surveillance, but it changes a society.

Walt Shaub: Yeah. Let’s listen to our interview with Jake Laperruque.

[music]

Jake Laperruque: Facial recognition is an artificial intelligence tool. It’s been around for a while, but really it started to blow up in terms of law enforcement and surveillance use over the past decade. I think, you know, a lot of people imagine it’s something like in Minority Report or CSI, that it’s mostly sort of a sci-fi tool. But the fact is it is a very common policing tool today. Over half of adult Americans have their photo in what could be some sort of facial recognition database. As far as we know (and this information’s very limited, so I assume the real number is higher), at least a quarter of all police departments have the capacity to run face recognition searches.

You know, there’s a very big range of how exactly the tech works, what it does. But in principle, basically, this is taking your face from a photo or a video, running it against a database of often millions or tens of millions of photos — probably mugshot databases, DMV photo databases, things like that — and instantly identifying a set of potential matches. Usually it’s not just, “Oh, here’s a photo. Oh, that’s Jake.” It’s, “Here’s a list of five people, 10 people, 20 people, including Jake, Walt, so on, et cetera.”

And law enforcement officers use that primarily now to identify people in photos. You know, typically that should be just a lead, but sometimes it’s used, basically, as the primary reason — or the sole reason it’s been used sometimes that we know of — to arrest an individual and put them in jail.
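To make those mechanics concrete, here is a minimal sketch of the kind of candidate-list search Jake describes, assuming faces have already been converted to numeric “embedding” vectors by some upstream model. Everything in it (the fake database, the names, the top_k_candidates helper) is illustrative and hypothetical, not any vendor’s actual system.

```python
# A minimal sketch of candidate-list face search. Real systems use learned
# face-embedding models and databases of millions of photos; here both are
# faked with random vectors so the example is self-contained.
import numpy as np

rng = np.random.default_rng(0)

# Pretend database: each enrolled photo is a 128-dimensional embedding.
names = [f"person_{i:05d}" for i in range(10_000)]
db = rng.normal(size=(10_000, 128))
db /= np.linalg.norm(db, axis=1, keepdims=True)  # normalize for cosine similarity

def top_k_candidates(probe, k=10):
    """Return the k enrolled faces most similar to the probe embedding.

    Note the output is a ranked list of candidates with scores, never a
    definitive single "match" -- which is why treating a hit as the sole
    basis for an arrest is so dangerous.
    """
    probe = probe / np.linalg.norm(probe)
    scores = db @ probe                  # cosine similarity against everyone
    best = np.argsort(scores)[::-1][:k]  # indices of the k highest scores
    return [(names[i], float(scores[i])) for i in best]

# A probe photo's embedding (random here, i.e., a face not in the database).
probe = rng.normal(size=128)
for name, score in top_k_candidates(probe, k=5):
    print(f"{name}: similarity {score:.3f}")
```

Even this toy version happily returns its five best guesses for a face that is not in the database at all, which is the false-positive problem in miniature.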

Walt Shaub: And Jake, you and I have talked about an incident that happened in Baltimore involving some protestors. Can you tell us about that and tell us whether it was an isolated incident or not?

Jake Laperruque: Yeah, so there was one instance — this was about seven years ago now, during the mass protests that happened in Baltimore after the death of Freddie Gray in police custody. You know, there was a lot of protesting, some rioting after that. And one thing police were doing, basically, was they used a service called Geofeedia.

This is a facial recognition service, among other things, that does social media scanning. It did facial recognition scans of photos people were putting up on social media: Facebook, Twitter, Instagram, et cetera. And it would scan to figure out who these people at the protests were. And what Baltimore police did is they took that info, figured out if people had any sort of old bench warrants or things like that, and then arrested them from the crowd for it.

So, this wasn’t, you know, someone in the moment looting or rioting or engaging in violence. This was using facial recognition to find some very old offense that they were not choosing to enforce and choosing to enforce it because the person was there at a peaceful protest.

There have been some more recent instances where we’ve seen this — or not quite this, but similar activity too. There was a report from the Florida Sun Sentinel last year that, throughout the 2020 civil rights protests, there were numerous cities in Florida that all were basically using facial recognition to identify protestors — again, peaceful events, not individuals that were, you know, engaged in any sort of criminal activity or rioting — but just, you know, “Who is this protestor? Who is that protestor?”

Virginia Heffernan: So, are technologists from Google or other private companies, big tech, big data, involved in facial recognition technology, either implementing it or developing it?

Jake Laperruque: Yes. There’s several major tech companies that do develop and sell the tech. Google’s always been very hands-off on facial recognition, which is interesting. Microsoft and Amazon are probably the two biggest, most well-known tech companies that do sell facial recognition. An interesting thing, though, is in the past couple years, they’ve both put a moratorium on selling to police and law enforcement because they said the tech isn’t quite accurate enough; there aren’t basic rules or restrictions on how it’s used.

I don’t think that those moves are going to be sufficient to guard against the tech. I mean, there are lots of companies that sell this that exclusively focus on selling police equipment, police tools. They don’t have a public profile like Microsoft and Amazon. They’re not worried about consumer boycotts. Even if it is problematic or is publicly opposed, they’re going to keep selling this tech. So, there’s always going to be, you know, a market out there, unless we actually put in rules and laws to limit it.

Virginia Heffernan: All right, so Jake, can you tell us a little bit about surveillance of Muslim communities after 9/11?

Jake Laperruque: After 9/11, you know, there was a very pervasive and very insidious surveillance operation directed at Muslim communities around New York. The NYPD, with some federal assistance, basically set up a Muslim surveillance unit to — on a massive scale — monitor Muslim communities. Monitor, you know, not individual suspects, just what these communities are doing.

They would have informants that they called “mosque crawlers,” you know – very gross term, but that was what they used – to basically go into mosques, go into Muslim community centers or events and catalog everyone who is there. You know, not sort of, “Hey, this guy seems very suspicious, there’s something wrong and we need to watch him,” just, “Who is everyone at these events? Who is everyone at these religious services?”

Now, as large scale and problematic as that was, that depended on, you know, having people engage in these activities, finding someone and having a lot of manpower devoted to doing it. You know, what it would take a week for one of those “mosque crawlers” to do you could do with a facial recognition system in a second, by just having a camera or a photo of a ceremony and clicking a button.

So, you know, while there are a lot of instances of that, you know, throughout history, whether it’s the NYPD’s Muslim surveillance unit, the Hoover FBI’s infiltration of civil rights organizations, of anti-war movements, of labor movements, you know, there is a very long and dangerous history of sort of using surveillance to identify, stockpile information on, catalog any type of government dissident or any type of marginalized group — be it religious, racial, et cetera — and then potentially target them for selective prosecution, potentially just leak sensitive details about their life.

A tool like facial recognition makes the manpower level so low, makes the government’s power to surveil so high and so easy, that if you don’t put limits on it, it takes those types of dangers of abuse to a scale we’ve never seen before.

[music]

Jake Laperruque: Everyone has something to hide. You know, if I ask either of you guys, “Can I read out your entire text message chain on this podcast right now?” I think you would, you know, even if you weren’t doing anything illegal at all, still find that kind of a queasy thing to do, because we just have private spheres we want to keep.

But I think, you know, the biggest issue, when people say, “I don’t worry about my privacy personally,” is: Think about the Fourth Amendment and privacy rights, you know, not as just a civil liberty, but as a civil right. And think about civil liberties in general that way. You know, we don’t only worry about religious freedom if we’re a religious minority. We don’t only worry about, you know, equal protection under the 14th Amendment if we’re a racial minority. These are principles that we believe are foundational to a democracy and a free society.

Privacy is the same way, because surveillance is used throughout history as a way to oppress marginalized groups, to oppress dissidents, to chill activism and speech. So, I mean, even if it’s not something you feel that you have a stake in personally, and have worries about personally, you know, care about it for those type of marginalized groups in the way that you would with free speech, freedom of religion, equal rights.

Walt Shaub: I saw that you testified before the Maryland state senate and told lawmakers there were five key priorities for limiting the use of facial recognition. And so maybe that would help us understand the line between what you think is appropriate and what’s not appropriate. Can you talk to us a little bit about that list?

Jake Laperruque: Yeah. So, I’ll say, kind of at the outset, I mean, we think probably the best way to take this would be to press pause on facial recognition. And, you know, way too often, we have debates where the government takes a sort of, “It’s better to ask forgiveness than permission” attitude towards surveillance. And instead of having a debate where we say, “Should we authorize this? If so, under sort of what rules and structures?” we found out, “Oh, the government’s been doing this incredibly invasive surveillance program or tool for years, and now we have to fight to get any limits on it.” That’s not a, you know, sort of very natural or healthy way to create policy, but it’s unfortunately just how surveillance often works.

So, with that caveat in mind — of, you know, I would much prefer we were talking about how to authorize facial recognition, not how to limit it, if we wanted to at all — I think, you know, sort of the five key limits, at least, that you need are a warrant rule and a limit to use in serious crimes. So, this is something that might be used to solve a murder but is not used for someone who shoplifted $5 of goods or someone who is jaywalking, where it can be very selective, and the enforcement can be subject to all kinds of misuse.

You need to have it disclosed to defendants. You know, you wouldn’t have someone facing some sort of sketchy forensic tool, like fingerprints a hundred years ago, just have that, you know, hidden from them. If it played a huge role in a case, you’d want to have them be able to analyze it. You shouldn’t have it be the sole basis for arrests or searches. You know, this is something that maybe can help as a lead, but it should not constitute probable cause.

And then, fifth, we shouldn’t allow this sort of untargeted scanning of massive crowds, the type of thing that we see in nations like China now, where instead of saying, “Here’s someone I have probable cause to believe engaged in a serious crime, and I want to identify who they are,” you just basically scan everyone that walks by, you know, a camera on the street, asking, “Who is that? Who is that? Who is that? Who is that?”

Virginia Heffernan: One thing we learned from January 6th, and also from the proliferation of so-called reverse-surveillance videos, including the “Karen videos” that were so pervasive during the pandemic, is that we are all — either in the form of reverse surveillance of us or just in the form of our general vanity online — uploading photos of ourselves to databases all the time. All the Instagram challenges to age yourself or to, you know, change your appearance in some way mean you’re just chronically uploading selfies. Can the government use those videos and images, some of which, in an extreme case, might be deepfakes or might be altered to point to other players? How much access do they have to our private data?

Jake Laperruque: Yeah, so they do use those photos and videos. The way that happens is, right now, primarily through a company called Clearview AI that you might have heard about. What they do is — I said before that, you know, typically law enforcement databases are built on things like mugshot databases, DMV photo databases. What Clearview does is they basically went to Instagram, Facebook, Twitter, you know, other social media sites, scraped off billions of photos from them, basically every single photo they could get their hands on, through sort of tech tools that let them do this en masse to billions of photos at a time, and then ran them through a facial recognition system and kept them there. Both to identify people — like, you know, your profile picture and others will help them identify who you are against a scan — but also, you know, they might just want to scan a million photos of New York and see if this person they are looking for pops up in the background of any of them.

It’s important to point out that, even though those photos are kind of going up online, none of these sites authorize that. In fact, they all explicitly, in the terms of use, prohibit this type of grabbing photos en masse. So, I think, you know, a lot of time people talk about private versus public and, oh, “You’re out in public, you’re putting that photo out publicly.” But it is with conditions. You know, a lot of people only share photos with friends. They only share their photos with circles of associates or even, like, smaller groups of friends. And a baseline condition, if I’m putting my photos on Facebook or Instagram, is the company saying, “This is here for you to use; this is not here for some company to grab up en masse by the billions.”

So really what we’re talking about is a company stealing your photos without consent and then putting them in a surveillance database.

Walt Shaub: Wow. And talk about creating databases, Jake, you recently tangled with the IRS trying to create a massive database. What happened with the IRS?

Jake Laperruque: So, the IRS was talking about potentially, well, not even potentially — they had full plans to go ahead with using a data fraud prevention company that basically uses facial recognition to make people hold up their phone and run [face] scans on their phone to log into accounts. You know, data fraud is a very serious issue, but the idea of using biometrics as the way of preventing it is just very bad.

This company, they’d previously — they still operate in a lot of states for unemployment benefits, and they’ve had huge problems. People will hold up their phone, and the facial recognition system will misidentify them, and they’ll end up spending, you know, two, three, four hours trying to get ahold of someone and get through. To just say, “I am who I say I am, this is me. And can I please have this unemployment assistance that I’m entitled to and really need?”

So that system was going to potentially be something that was required for every single person who wants to access tax services online for the IRS.

Walt Shaub: So, you’re saying they would have to show their face to the camera to be able to interact with the IRS?

Jake Laperruque: Yeah. To use a set of services that the IRS provides online, the plan was to make them kind of go through this automated facial recognition verification system. Which, you know, probably the most immediate concern for that was that the system does not seem to work very well. You know, lots of facial recognition tools have lots of issues. There are algorithmic bias issues: Women and people of color tend to be identified accurately less often by most algorithms.

Low-quality images, or images from inconsistent angles or low lighting, tend to produce errors. So that was the main concern there. And that’s another one that ties a lot into the law enforcement issues. But also, it was a question of, well, once you start to require this, I mean, does it lead to stockpiling of photos that could be used for other purposes?

Fortunately, we and a lot of other advocates were very vocal about citing those problems, and the IRS backed down, and now they will not be pursuing that plan. So, you know, it’s an issue that’s going to keep coming up for verification. But I think it was a very good example, for all the talk that, “Oh, maybe privacy’s dead in the digital age,” that a lot of people do care about it and can make an impact on it.

Walt Shaub: Jake, is there sort of an arms race between technology and the laws? Are the lawmakers keeping up with the technological advances?

Jake Laperruque: Yeah. I mean, this is why I sort of — why I can sometimes feel hopeful about privacy and surveillance. I also feel like, it’s sort of, we’re never going to quote-unquote win this issue. It’s always going to be something, you know, you have to stay vigilant on, because the pace at which tech moves has just consistently outpaced the law. Like I said before, you know, the government always takes a sort of “ask forgiveness, not permission” attitude towards surveillance.

You know, we saw that with the NSA and bulk collection and all the revelations that came out in the mid-2000s; we see it with drones, with facial recognition, with location tracking — all of these kind of new tech tools. So aside from, you know, having that kind of issue with the process kind of working backwards in how we make the laws, it’s also just ... even when lawmakers are vigilant, there’s always something new coming out.

And you know, at a time when I think we very often worry about the future of our democracy, that type of value and need to sort of balance government power really shines out. There’s a lot of battles ahead, whether it’s these kinds of issues of cell phone tracking and filling some of the loopholes there that I mentioned, or something like facial recognition, or drones and aerial surveillance, where there are a few good state laws, but which really has been left to wither at the federal level.

You know, on the one hand, I think there’s a lot of belief that we need to fix those and deal with those issues, and a lot of broad support for it. But there is a lot of work to do ahead. So, I feel somewhat optimistic, but also sort of bracing for a marathon.

[music]

Virginia Heffernan: I mean, it’s useful that Jake gave us a roadmap for legislation that can rein in government surveillance. Can you run through that list again?

Walt Shaub: Yeah. Jake listed them in his testimony for the Maryland state legislature recently, and here’s the list he gave lawmakers. Number one, requiring that face recognition searches are based on probable cause. Number two, limiting the use of face recognition to the investigation of serious crimes. Number three, prohibiting face recognition from being used as the sole basis for arrests. Four, requiring notice to defendants whenever face recognition is used. And five, prohibiting face recognition from being used for untargeted surveillance, sort of that crowd sweep that he talked about.

Virginia Heffernan: So, when he was talking about needing a warrant, that’s tied to the first item, requiring that face recognition searches are based on probable cause, right?

Walt Shaub: Yes. The idea is that the law should require law enforcement to show there’s probable cause to suspect that a crime has occurred before they can even run the facial recognition software.

Virginia Heffernan: So, who decides if they have satisfactorily shown there’s probable cause? Who makes that decision?

Walt Shaub: Who decides is a crucial issue, and the cops are always going to think they have probable cause, which is why it’s best if you require a judge to make that call. You know, if you require the cops to get a warrant, then the judge is the one who’s going to look at an application the police submit and decide if there’s a sufficient basis to run the photo through facial recognition.

Virginia Heffernan: I mean, it seems like there’s also a problem if you’re using face recognition for minor offenses. If someone steals a lipstick from a drug store, it seems totally out of proportion to bring in that kind of heavy technology. And a defendant who’s accused of stealing a lipstick probably can’t afford to run out and hire expert witnesses to disprove a facial recognition match.

Walt Shaub: Right. That’s the idea behind the second item on Jake’s list, which is that you should use the technology only for serious offenses. I asked Jake about that recently and he told me in reality, the defendant probably wouldn’t even be aware that facial recognition was used. So, if the technology comes up with a false positive and the police grab the wrong guy, he or she won’t even know that was why they were fingered for the crime. And that gets at another item in Jake’s list: The defendant should be told whenever facial recognition was used.

[music]

Walt Shaub: Now the focus today is facial recognition technology, but it’s important to recognize that facial recognition is just one tool in the government’s surveillance arsenal. And like all tools, facial recognition is susceptible to whatever baggage the people using it carry with them. If there are problems in the way government conducts surveillance, problems like abuse of power and bias, then advancements in technology are only going to magnify those problems. You can’t correct a social problem – a human problem – with a technology solution. Sometimes you can make it worse, though.

If the government is using surveillance to target certain populations, then a technology like facial recognition is going to make it easier to target those populations by increasing the government’s capacity to zero in on those populations. For that reason, it’s important to reflect on how the government is already abusing surveillance capabilities by targeting vulnerable populations.

In particular, events like the killing of Freddie Gray in Baltimore, George Floyd in Minneapolis, Eric Garner in New York, and so many similar tragedies have got me thinking about how the government targets Black people for heightened scrutiny. I think if we look at that issue, we can glean some insight into a bias that’s already influencing law enforcement practices and understand a little better why it’s so important to regulate the use of facial recognition technology.

A little later, we’ll round out the picture by looking at a worst-case scenario when we hear from Maya Wang about how facial recognition technology and surveillance in general are being deployed against Muslims in China. But for a look at the issue of bias in surveillance right here at home, I thought there was no one better to contact than Paul Butler. He’s a former federal prosecutor who’s now a law professor at Georgetown University, and he has thought a lot about this issue.

Paul Butler: If you go to criminal court in the District of Columbia, you would think that white people don’t commit crimes. They are almost half of the city’s population, but they almost never get prosecuted. The message from the superior court in DC, like courthouses all over the country, is that white people don’t use drugs; they don’t steal; they don’t get into fights; they don’t commit sexual assaults. But Black people, Latinx people, man, those are some bad dudes.

This is the message sent at criminal courthouses for the misdemeanor crimes, which constitute the vast majority of arrests. And, Walter, obviously white folks commit these crimes all the time. The evidence shows us that white people use drugs just as much as Black people, but 60% of the people locked up for that crime are Black and Latinx. And one reason for these vast race disparities is this laser focus on Black communities when police are enforcing certain crimes.

They claim that they’re going where the drugs are. And Walter, I love my students at Georgetown, but I want to respectfully say that there is as much illegal drug use in our communities as there is in Anacostia, which is a low income, primarily Black community in DC. But the cops are heavy up in Anacostia. They are not heavy up in Georgetown dorms. And it turns out that there’s this important relationship between looking for things and finding things. One reason the police find more crime in communities of color is because they’re looking at our communities more.

And one great example is that in New York, the time that the police make the most arrests is – I have it in my book – I can’t remember the exact time, but let’s say it’s Wednesday at 2:00 PM. Is that because, for some odd reason, that’s when people are committing crime? Is it kind of like a full moon thing? No. It’s just that that’s the time when there are the most police officers on duty.

Walt Shaub: Paul, you once told a story, and I don’t know if you remember it, about a police officer who took people on a ride along to show them how you can pretty much find anything you’re looking for.

Paul Butler: This is a buddy of mine who was a police officer in DC for over 20 years. And to show the students in my criminal law class how much power he has, he would invite them to go on a ride along. And that means that they’d sit in the back seat of his squad car and accompany him through the mean streets of DC to see what it’s like to be a cop. And to demonstrate this extraordinary power to search, to seize, to watch, he’d invite the students to play a game, which he called “stop that car.” He’d tell the student, “Pick any car you want, and I’ll stop it.” So, the student might point to, say, “That white Camry over there.” He’s a good cop. He waits until he has a legal reason. But he says that he can follow any car for a few blocks, a couple of minutes, and find a reason to pull it over. There are so many traffic infractions that we all commit them.

And this gives an extraordinary power, because the Supreme Court has also ruled that when police officers stop a car, they can order the driver to get out. They can order the passengers to get out. They can ask, “Can we search your car?” without telling you that you have the right to say “No.”

Walt Shaub: In 2000, the Supreme Court really deepened inequity in our country with an appalling concept it called “high crime areas.” Can you tell us a little bit about what that theory allows?

Paul Butler: There was a case called Illinois v. Wardlow in which Chicago cops were on a routine patrol in a Black community. And they saw a man see them. And when the man saw them, he took off running. At least that’s what the police claimed. The police ran after him and stopped him. They seized him. He couldn’t go. Why did the cops do that? They say, “Well, because he ran when he saw us.”

“Did you have any reason to suspect him of a crime before that?”

“No, we didn’t.”

The question before the Supreme Court in Wardlow is, can the police do that? Can they actually seize you? Can they detain you even when they don’t suspect you of any crime, but they just want to see why it is that you didn’t want to be bothered with the cops? And this answer is still unbelievable to me, even though I’ve taught this case in my criminal procedure class for many years. Walter, what the court said is that the police can stop you in a high crime area, but not in a low crime area.

So, what it means is that if you’re in a “high crime area” — and the way that courts interpret this, pretty much, every community of color is a high crime area — and you communicate that you don’t want to interact with the cops: You turn away from them. You walk away from them. The police can detain you to see why you did that. If you do the same thing in a non-high crime area, let’s say Chevy Chase, DC, a rich, mainly white suburb, the police have to leave you alone.

So this is explicitly Jim Crow: Differential police power in communities of color and rich white communities. And it’s perfectly constitutional. And that’s why the theme of my book, Chokehold: Policing Black Men, is that the problem isn’t bad apple cops. The problem is the system is working the way it’s supposed to. Our criminal legal process is set up to give the police these superpowers, which they almost exclusively use in Black and brown communities.

Walt Shaub: Paul, you also once said that (and I’m quoting here), “surveillance always demands a response.” And that just sent chills down my back. Can you talk about that?

Paul Butler: The government isn’t looking at you because it has nothing better to do. When the police are laser focused on, let’s say, Black men, they’re looking for a reason. And the reason, I think, is they want an excuse to mess with you. To touch your body in the way that their power to stop and frisk gives them. They’re looking for an excuse to demonstrate their dominance and their control.

And so, if they’re really concerned about who’s using drugs, if they’re really concerned about disorder in communities — again, Adderall on university campuses, illegally obtained Adderall, is an issue. It’s as much of an issue as illicit drug use is in communities of color. But I don’t think that when they’re enforcing these low-level misdemeanor crimes, or when cops are acting as warriors in the war on drugs, that it’s about public safety.

If it were really about public safety, they’d be more focused on white communities. So I think that the diagnosis that we hear from the Movement for Black Lives is the right diagnosis, that now our police departments aren’t so much operating as departments of public safety, but rather they’re mainly about surveillance and control of poor people, of Black people, of Latinx people, of transgender people, of immigrants, of a lot of communities that for a long time have been considered outsiders — people who, who don’t deserve the protection of the Bill of Rights.

And while now that’s formally illegal and unconstitutional, to treat us differently, every study of the police demonstrates that they still do.

One crazy example is from the Ferguson report, which is the report that the United States Department of Justice prepared about its investigation of the police department in Ferguson, Missouri, which is where an officer killed Michael Brown. The Ferguson report is this amazing synthesis of data and stories. The data: Every time the Ferguson police used a dog, they used it against a Black person.

But Walter, just one quick story to crystallize the way that the police use their power to surveil Black and Brown people. This story is in the Ferguson report. You can find it online. A woman called the police because her boyfriend was beating her up. By the time the police got there, he was gone. The cops look around the apartment. “Does he live here?” She said, “Yes, he does.” The police said, “You’re under arrest. His name is not on the occupancy permit.” And that’s a violation in Ferguson. Walter, when that happened to another Black woman in Ferguson, she said she would never call the police again. She didn’t care if she was being killed.

Walt Shaub: That’s heartbreaking. Paul, I think the last thing I wanted to do is just talk about how surveillance can lead to tragic results. And the first thing that comes to my mind is Freddie Gray in Baltimore.

Paul Butler: So, I mentioned this case called Wardlow, in which the police are given the power by the Supreme Court to detain folks in high crime areas who the police think are trying to evade them — again, even if there’s no reason to suspect them of a crime.

Freddie Gray was a young Black man in Baltimore who wasn’t doing anything illegal. He was just in the wrong place at the wrong time. And the wrong place meant a Black neighborhood in Baltimore. The wrong time meant that there were cops around, including some officers who were on a bike patrol. When Mr. Gray saw those officers, he turned around and walked in the opposite direction. And that’s when the police seized him. Not because he committed a crime, but because he made the decision as an American citizen that he didn’t feel like being bothered with the cops that day.

When the cops seized him, they also searched him. They frisked him. They patted him down, and they found a knife which most people think he legally carried. We’ll never know because there was never any criminal prosecution, because they arrested Mr. Gray, they put him in a police van, and by the time that he reached the station house, his spine was destroyed. And Mr. Gray later died.

And the police were able to stop Mr. Gray and search his body because of this incredible power that law enforcement has to control us, when Mr. Gray’s only offense was to communicate to the police that he didn’t want to interact with them. And for exercising that right, which every citizen should have, Mr. Gray paid with his life.

Walt Shaub: Professor Paul Butler, thank you so much for talking to us today.

Paul Butler: It’s always a pleasure, Walter.

Virginia Heffernan: All right, let’s shift gears a little bit and hear about a sort of limit case, a worst-case scenario, the dystopian world of Chinese government surveillance. For that, we interviewed Maya Wang. Her title is senior China researcher at Human Rights Watch. Just as an aside, we interviewed both Jake and Maya by Zoom and they had something in common. They both had their cameras off when we interviewed them.

Walt Shaub: Hey, it was a vivid reminder of how hard it is to protect yourself against this kind of mass surveillance.

Virginia Heffernan: Yeah. It was interesting to see that. You and I have our pictures out there everywhere. I guess it’s too late for us, but good for Maya and Jake. Let’s hear what Maya has to say.

Walt Shaub: So, Maya, how pervasive is government surveillance in China?

Maya Wang: It is very pervasive. The Chinese police have — or are constructing — a very comprehensive, multi-layered surveillance system that blankets much of the country, with essentially an explicit political goal, which is to ensure that the Chinese Communist Party can rule forever.

And of course, I mean, the police also perform many different kinds of functions, like police elsewhere. But the surveillance infrastructure in China has an explicit political goal, which is to ensure what they call “social stability,” which is kind of the shorthand for ensuring the Communist Party’s grip on power in China.

Walt Shaub: And how is facial recognition being used?

Maya Wang: I think often people — I think because there has been a lot of reporting about facial recognition in China, and because facial recognition is also a technology being used elsewhere in the world, and because it is such a visible form of surveillance, I mean, in the form of, you know, cameras in public places — I think it evokes a very kind of strong feeling among people around the world.

And there has been strong pushback against the use of facial recognition across the world, particularly in democratic countries. At least, well, people should be pushing back against them. But at the same time, I have to say that facial recognition really is only one part or one pillar of China’s mass surveillance systems, in which there are many parts.

Obviously, the government has put in place many kinds of these camera systems around the place, primarily in urban areas, but also increasingly in rural areas. And these cameras are now equipped with artificial intelligence analytical capabilities. And that means that, you know, these systems are designed to recognize certain things in the visual world. So, in the past, the difficulty of surveillance camera systems was that generally, if you’re, say, looking for someone in the video stream, you would have to spend a lot of officer hours watching that surveillance footage to find someone, right?

So, what these cameras, these new camera systems equipped with artificial intelligence, AI, do is that they are meant to recognize meaningful elements in the visual field to aid police work. And facial recognition is one type of that analysis, right? It’s to recognize the people in the visual field. However, that’s not all that is actually being done, and other analytics are being conducted. For example, the surveillance systems are able to recognize color, right? Able to recognize objects, direction, whether or not a crowd is moving in an unusual manner. You know, license plates, whether or not vehicles are involved in an accident.

And what that also means is that the idea is to break down the complicated visual information into, essentially, almost like a text, where you can then search through this information in a much quicker manner. So, let’s say I’m looking for someone wearing a red shirt and heading north, towards the north of the city. Ideally these systems would be able to actually search through the video surveillance footage all across the city to find the person that fits that description.

So that’s one way in which this is used. The other way in which these systems are used is, for example, certain lists of individuals authorities may be looking for.
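To illustrate what Maya means by breaking video down into something “almost like a text,” here is a minimal sketch, assuming an upstream vision model has already reduced each camera frame to structured attribute records. All of the field names and values are hypothetical.

```python
# A minimal sketch of attribute-based video search. It assumes a vision
# model has already reduced camera footage to structured records like these.
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str    # which camera saw the person
    timestamp: str    # when they were seen
    shirt_color: str  # attribute extracted by the (assumed) vision model
    heading: str      # estimated direction of travel

detections = [
    Detection("cam_12", "2022-05-06T14:01:00", "red", "north"),
    Detection("cam_07", "2022-05-06T14:03:30", "blue", "east"),
    Detection("cam_31", "2022-05-06T14:05:10", "red", "north"),
]

# "Find someone in a red shirt heading north" becomes a plain filter:
# the video has effectively been flattened into searchable text.
hits = [d for d in detections
        if d.shirt_color == "red" and d.heading == "north"]
for d in hits:
    print(d.camera_id, d.timestamp)
```

Once footage is indexed this way, searching a whole city’s cameras becomes as cheap as querying a database, which is what makes surveillance at the scale Maya describes possible.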

Walt Shaub: I think I saw that one of the items was even just something as simple as growing a beard.

Maya Wang: Yes.

Walt Shaub: And so, when you draw their attention like this, and you flag some criteria that somebody in an office somewhere decided is suspicious, what do they do with you? I mean, you mentioned they go to jail. Is that only if they can prove you’ve committed a crime?

Maya Wang: No.

Walt Shaub: Or, or even if they just think you’re a potential threat to the party, because you’re suspicious?

Maya Wang: It’s the latter. We are talking about a government that has no rule of law, or, rather, where the government, the Chinese Communist Party, essentially controls the law and uses it to achieve political goals, married with its surveillance capabilities.

Then you have these extreme cases like in Xinjiang, where people are being imprisoned for growing a beard. It just sounds incredible, right? But from the government’s perspective, growing a beard is a sign of your religiosity, of the fact that you are a Muslim, and Muslims in Xinjiang are generally considered by the government to be potentially anti-government. And, thus, having a beard can put you in prison. And it is something that happens. It’s dystopian, of course, and there is no way to challenge it, right?

Like in normal times, you’d be like, “That makes no sense.” But it’s a bit like, you know, you’re in a situation where you don’t have any rule of law. Lawyers will not represent you in Xinjiang, because these are all state security crimes. Even — people who are put in these political education camps do not go through a trial whatsoever. There’s no charge. But if you do get placed in, you know, the criminal procedure process, there is some kind of very expedited trial. But like I said, no lawyer in their right mind would actually represent you in a way that would be recognized as the rule of law, right?

This is not to say that the Chinese government’s surveillance capabilities are in any way sophisticated, right? This is intrusive and abusive because they can do that to people. Like, you can say, “Well, a system that purported to be fighting crime is actually just fighting people who grow beards.” Right?

Walt Shaub: Right.

Maya Wang: It is not an efficient system. It is not even the right system. And yet they can do that because they’re not ever held accountable for wildly inaccurate claims. The idea is that, through understanding these general patterns, the police in China are able to kind of catch and predict what they consider crimes, including political crimes. These are forms of what the authorities call “predictive policing,” and they are very much inspired by what’s going on in the U.S. Many of these systems draw their inspiration from the U.S. and the U.K., in their policing tactics and so on.

Walt Shaub: I wonder if there are lessons the West, and particularly the U.S., should learn from China’s experience and about what we should avoid to prevent this from becoming our future too.

Maya Wang: The key is to understand that facial recognition is just one form of surveillance. I think the danger is the integration of data. Sorry, I should also say that, like I said, there are many different forms of identifiers of yourself, right? Some are harder to change than others. So, your face is obviously an important identifier that really should be protected as sensitive information.

So, there should be laws that really tightly regulate the use of surveillance, both by the government and by private entities that meet international human rights standards as I just outlined. The government’s gathering of information needs to be necessary and proportionate. What we need to watch out for is the ability for any entity to integrate large amounts of data, including your sensitive, identifying information, making it impossible to escape surveillance for different purposes, whether or not we are talking about surveillance capitalism, or, you know, surveillance for political goals.

We are also in a world where the context we’re in is very important, right? In China, you have essentially the same technologies, but the systems are different, right? We’re talking about integration of data. We are talking about a government that is completely unaccountable, an authoritarian one, that uses these systems, and makes sure these systems work, to essentially ensure that it will rule forever. But in a democratic society, what you really have to worry about is to make sure that we don’t kind of sleepwalk into that kind of situation.

[music]

Virginia Heffernan: Oh, that was just chilling. It goes to show how arbitrary things can get when there are no limits on government surveillance. I mean, well, can you imagine being put on a list because you used the back door to go in and out of your home or because you grew a beard or stopped using a smartphone?

Walt Shaub: I think authoritarian regimes tend to fall into a state of paranoia trying to hold onto power everywhere. The leaders operate in a bubble. And I honestly think the isolation makes them weird.

Virginia Heffernan: Yeah. I mean, humans do get weird in isolation. We kind of saw that during the pandemic. I think I got weird. I also found it a little unsettling, what Maya said, that the Chinese were learning lessons about how to do this surveillance from the U.S. and the U.K.

Walt Shaub: Yeah. That one really turns some of my assumptions on their head.

Virginia Heffernan: I think it’s a reminder that we need to be vigilant, which is kind of an odd thought, because we’re talking about being vigilant, personally and civically, about government vigilance. But we really do need the public to understand what can happen if surveillance is carried out to the limit.

Walt Shaub: You know, hearing Jake talk really reminded me about all of that surveillance the government did of Muslim communities in the 2000s and just the sheer excesses of it and the prejudice that drove it. It was the Patriot Act that really opened up doors for new abusive law enforcement techniques, and it normalized that type of conduct.

Virginia Heffernan: I mean, at the same time, don’t you think that when the government abuses this surveillance technology, it actually generates public interest in the subject? Although I’m not sure the public ever really grappled with the prejudice that drove the targeted abuses in the 2000s. But people may have started getting a little worried about their own privacy, especially after learning stuff like how the NSA’s bulk metadata collection program collected information even about our phone calls.

Walt Shaub: That one led to an interesting development. In 2015, privacy advocates had, I guess I’d call it, a notable win, when Congress passed the USA Freedom Act. It banned bulk collection by requiring that requests for phone records be targeted by including, quote, “a specific selection term,” such as a name or a phone number.

Virginia Heffernan: So, it sounds like things may not be entirely hopeless.

Walt Shaub: Definitely not. It’s an ongoing struggle, and privacy advocates, including POGO and many other groups, are going to have to keep fighting to curb the government’s authority in this area. And the stakes are really high, as we saw with Maya Wang’s talk about what’s going on in China. And the thought that that could be our future just makes my blood run cold.

I think my biggest concern is the equity issue: the government’s targeting, in particular, of Black people and others for heightened scrutiny. So, I think this is an area where the public needs more awareness and needs to resist the temptation to shrug it off. But I think if people do engage, there’s real potential for reining in government surveillance. And I think that privacy advocates are making a difference. My last word on this would be that privacy matters because it’s a human rights issue.

Virginia Heffernan: Yeah. That was something that came through loud and clear from the interviews today. So that’s it for this episode of The Continuous Action. We’ll be back with a look at the growth of presidential power and the need to rein it in. The Continuous Action is hosted by me, Virginia Heffernan, and Walter Shaub, and produced by Myron Kaplan with help from Bubba Bach.

[music]