Bonus: China’s Surveillance State

May 13, 2022

This bonus episode features our full interview with Maya Wang, a senior China researcher at Human Rights Watch. As Maya explains, a comprehensive, multi-layered surveillance system blankets much of China with one primary goal: to ensure that the Chinese Communist Party can rule forever. We included an excerpt of this interview in Episode 3, “The Eyes On Your Face.” But the rest of Wang’s harrowing account of this surveillance system was too compelling to leave on the cutting room floor. Join us for a deeper dive into China’s surveillance state.

The Continuous Action is sponsored by The Project On Government Oversight.

Stay tuned for the latest from POGO, and don't forget to subscribe to The Continuous Action on Apple, Spotify, Stitcher, or Acast.

Our guest, Maya Wang, emphasized the breadth of surveillance in China:

Maya Wang [02:21]: The surveillance infrastructure in China has an explicit political goal, which is to ensure what they call “social stability,” which is kind of the shorthand for ensuring the Communist Party’s grip on power in China.
Maya Wang [13:39]: Then you have these extreme cases like Xinjiang, where people are being imprisoned for growing a beard. It just sounds incredible, right? But from the government’s perspective, growing a beard is a sign of your religiosity, of the fact that you are a Muslim, and Muslims in Xinjiang are generally considered by the government to be potentially anti-government. And thus, having a beard can put you in prison. And it is something that happens.
Maya Wang [33:58]: There should be laws that really tightly regulate the use of surveillance, both by the government and by private entities, laws that meet international human rights standards, as I just outlined. The government’s gathering of information needs to be necessary and proportionate.

Virginia Heffernan: Hey, Walt!

Walt Shaub: What’s up, Virginia Heffernan?

Virginia Heffernan: Maya Wang is so great. I think we should just post her full interview as bonus material.

Walt Shaub: Let’s do it.

Maya Wang: My name is Maya Wang and I’m a senior China researcher at Human Rights Watch.

[music]

Walt Shaub: Hi folks. This is Walt Shaub. Virginia Heffernan and I have been co-hosting The Continuous Action, a new podcast sponsored by the Project On Government Oversight, a nonpartisan, independent government watchdog, and we’ve been exploring issues confronting democracy. In episode three, we explored the issue of government surveillance. One of the experts we talked to was Maya Wang, and the interview with Maya was so fascinating that we just kept talking to her for about 40 minutes. And obviously we couldn’t fit all of that into episode three of the podcast. So, we thought we would post the full interview for you as bonus content. I think you’re really going to enjoy it. Maya takes us deep into the dystopian world of surveillance in China, sort of a worst-case scenario for a future we hope to avoid. And Maya has some advice for exactly how we avoid it. Hopefully you’ve been enjoying the podcast and episode three was all you hoped it would be, but if you’re craving more, you’re going to love this interview with Maya Wang. Let’s listen to it right now.

Walt Shaub: How pervasive is government surveillance in China?

Maya Wang: It is very pervasive. The Chinese police have built — or are building — a very comprehensive, multi-layered surveillance system that blankets much of the country, with an explicit political goal: to ensure that the Chinese Communist Party can rule forever.

And of course, the police also perform many different kinds of functions, like police elsewhere. But the surveillance infrastructure in China has an explicit political goal, which is to ensure what they call “social stability,” which is kind of the shorthand for ensuring the Communist Party’s grip on power in China.

Walt Shaub: And how is facial recognition being used?

Maya Wang: I think that because there has been a lot of reporting about facial recognition in China, because facial recognition is also a technology being used elsewhere in the world, and because it is such a visible form of surveillance — cameras in public places — it evokes a very strong feeling among people around the world.

And there has been strong pushback against the use of facial recognition across the world, particularly in democratic countries. At least, people should be pushing back against it. But at the same time, I have to say that facial recognition really is only one part, one pillar, of China’s mass surveillance systems, which have many parts. And I will explain how it is being used.

Obviously, the government has put many of these camera systems in place, primarily in urban areas, but increasingly in rural areas as well.

And these cameras are now equipped with artificial intelligence analytical capabilities, which means these systems are designed to recognize certain things in the visual world. In the past, the difficulty with surveillance camera systems was that if you were looking for someone in the video stream, you would have to spend many officer-hours watching surveillance footage to find that person, right?

So what these new camera systems equipped with artificial intelligence, AI, do is recognize meaningful elements in the visual field to aid police work. Facial recognition is one type of that analysis: recognizing the people in the visual field. But that’s not all that is being done; other analytics are conducted as well. For example, the surveillance systems are able to recognize color, objects, direction, crowds — whether or not a crowd is moving in an unusual manner — license plates, and whether or not vehicles are involved in an accident.

And what that also means is that the idea is to break complicated visual information down into something almost like text, so that you can search through it much more quickly. So, let’s say I’m looking for someone wearing a red shirt and heading north, toward the north of the city. Ideally, these systems would be able to search through the video surveillance footage from all across the city to find the person who fits that description.
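To make that concrete, here is a minimal, hypothetical Python sketch of the idea Wang describes: once a vision model has reduced camera footage to structured attribute records, finding “the person in a red shirt heading north” becomes a quick filter over text-like data rather than hours of watching video. Every field name and record below is invented for illustration; nothing here reflects an actual system.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str     # which camera produced the record
    timestamp: float   # when the person was seen
    shirt_color: str   # clothing attribute extracted by the vision model
    heading: str       # coarse direction of travel, e.g., "north"

def search(detections, color, heading):
    """Return every detection matching the attribute description."""
    return [d for d in detections if d.shirt_color == color and d.heading == heading]

# Example: query a day's detections from cameras across the city.
feed = [
    Detection("cam-12", 1652400000.0, "red", "north"),
    Detection("cam-31", 1652400312.0, "blue", "south"),
]
print(search(feed, color="red", heading="north"))  # only the cam-12 record matches
```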

So that’s one way in which this is used. The other way these systems are used, going back to your original question about facial recognition, involves lists of individuals the authorities may be looking for.

So, there are lists of people the police are looking for, for what the authorities call “crimes.” Some of these are real crimes. But because the Chinese police are also tasked with catching political criminals, the lists include people who are merely dissidents, or activists, or petitioners — people complaining about government misconduct. These camera systems can be used, through facial recognition or other analytics, to catch these people who are merely opponents of the regime, along with the other people the police want to catch.

But this is just one way in which facial recognition or camera systems can be used. Another is how these camera systems are integrated into policing surveillance systems, which are much bigger than the camera systems alone.

Walt Shaub: One of the things that really struck me was reading an article about how your group, Human Rights Watch, reverse-engineered a cell phone app that police use to capture information about people they encounter and record it in a centralized system. I was amazed at the type of things they were capturing. Can you tell us a little bit about that?

Maya Wang: Sure. So, the app we reverse-engineered is, or was, used in the region of Xinjiang in northwestern China. In a country with deep human rights abuses, Xinjiang is a region with even worse abuses. The majority of its residents are ethnic minorities, many of whom are Muslims.

And the authorities have long considered them problematic because they are so different. The government has used the language of counterterrorism, essentially capitalizing on the U.S. war on terror, to label these minorities — some of whom, like I said, happen to be practicing Muslims — to characterize essentially the whole group as terrorists, and to use that excuse to deprive them of a range of rights, even more so than in the rest of China.

And so it was in this context that surveillance has been imposed with greater severity on Xinjiang. As part of that, among these many surveillance systems, a central one is called the “Integrated Joint Operations Platform,” or IJOP. The IJOP is essentially the brain, the big data brain, of these surveillance systems.

And it is connected to many different things around the region, including the camera systems. Imagine that if IJOP at the backend is like the brain of the surveillance systems, the camera systems are like the sensory nodes of that brain, right? Other sensory systems across the region include things like data doors. People are required to pass through checkpoints — throughout the cities, when they enter and leave villages, along the highways, everywhere — and some of these checkpoints are equipped with technology that recognizes your face but also surreptitiously collects information.

Like the MAC address — identifying information from your phones and computers — collected as you walk through these doors. Other devices throughout the city also collect your identifying information: the MAC addresses of your devices, IMEI [International Mobile Equipment Identity] numbers, and license plate numbers. All of this information is integrated and analyzed at the backend using big data systems.

And what that means is that, for example, if you drive a car that doesn’t belong to you; if you go to the gas station, quote, “too many times a day”; if your phone suddenly goes off grid, meaning it hasn’t been used for many days or is suddenly no longer connected to the network; or if you call abroad — then the system picks you up as an abnormal signal and dispatches a nearby officer to interrogate you, to see whether or not you are suspicious.

If these people are considered suspicious by government officials, they can be sent to political education camps, where they are arbitrarily detained without charge or trial. Or they could be sent to prison, where they face very lengthy sentences, over a decade in some cases. And the IJOP app is connected to the backend system. It is used as a kind of communication tool: the government officials — primarily the police — carry phones with the app installed.

And we were able to get ahold of this app and reverse-engineer it to understand the criteria by which people are considered suspicious, the behavior that can get them into trouble according to the IJOP system. And you’re right to say that these criteria seem very odd. If you look inside the app, it tells you that people are considered suspicious for the variety of reasons I just outlined, but there are many more. For example: using too much electricity; entering your home through the back door instead of the front door; donating to a mosque very enthusiastically; using WhatsApp; using a VPN, a virtual private network, which is how people in China scale the firewall, essentially avoiding the censorship to access information abroad.

These kinds of activities are all considered suspicious by the authorities. And we were able to understand that by reverse-engineering the app, which is to say we looked at the app’s code to see what it actually tells police officers are suspicious activities.
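As a hedged illustration of what such a criteria list might look like in code (a sketch of the rule-list pattern Wang describes, not the actual IJOP implementation), here is a toy flagging routine in which tripping any single rule marks a person for police follow-up. Every field name and threshold is invented.

```python
# Each "suspicious" criterion is a boolean check over a person's records.
SUSPICIOUS_RULES = {
    "uses_vpn":         lambda p: p.get("vpn_detected", False),
    "uses_whatsapp":    lambda p: "whatsapp" in p.get("installed_apps", []),
    "high_electricity": lambda p: p.get("monthly_kwh", 0) > 400,  # invented threshold
    "back_door_entry":  lambda p: p.get("enters_via_back_door", False),
    "phone_off_grid":   lambda p: p.get("days_phone_offline", 0) > 7,
    "calls_abroad":     lambda p: p.get("foreign_calls", 0) > 0,
}

def flag(person):
    """Return the name of every rule this person's records trip."""
    return [name for name, rule in SUSPICIOUS_RULES.items() if rule(person)]

# A single tripped rule is enough to dispatch an officer for questioning.
record = {"vpn_detected": True, "foreign_calls": 2}
print(flag(record))  # ['uses_vpn', 'calls_abroad']
```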

[music]

Walt Shaub: I think I saw that one of the items was even just something as simple as growing a beard.

Maya Wang: Yes.

Walt Shaub: And so, when you draw their attention like this, and you match some criterion that somebody in an office somewhere decided is suspicious, what do they do with you? I mean, you mentioned people go to jail. Is that only if they can prove you’ve committed a crime?

Maya Wang: No.

Walt Shaub: Or even if they just think you’re a potential threat to the party, because you’re suspicious?

Maya Wang: It’s the latter. The problem is a surveillance state married to a government with essentially no rule of law — or rather, a government, the Chinese Communist Party, that controls the law and uses it to achieve a political goal, married to its surveillance capabilities.

Then you have these extreme cases like Xinjiang, where people are being imprisoned for growing a beard. It just sounds incredible, right? But from the government’s perspective, growing a beard is a sign of your religiosity, of the fact that you are a Muslim, and Muslims in Xinjiang are generally considered by the government to be potentially anti-government. And thus, having a beard can put you in prison. And it is something that happens. It’s dystopian, of course, and there is no way to challenge it, right?

In normal times, you’d say, “That makes no sense.” But this is a situation where you don’t have any rule of law. Lawyers will not represent you in Xinjiang, because these are all state security crimes. People who are put in these political education camps do not go through a trial whatsoever; there is no charge. If you do get placed in the criminal procedure process, there is some kind of very expedited trial. But like I said, no lawyer in their right mind would actually represent you in a way that would be recognized as the rule of law, right?

Lawyers go there and it’s performative, and you are going to be imprisoned. We have a lot of cases of people being imprisoned for having attended a prayer session for a neighbor who died; an elderly woman got six years in jail for that. Another man listened to a religious recording sent by his daughter (because of course people listen to religious recordings, prayers and such), and he was imprisoned, and so was the rest of his family, because they circulated a prayer. This is what we are talking about, in the absence of any ability to push back against government surveillance.

This is not to say that the Chinese government’s surveillance capabilities are in any way sophisticated, right? This is intrusive and abusive because they can do that to people. You can say, “Well, a system that purported to be fighting crime is actually just fighting people who grow beards.” Right?

It is not an efficient system. It is not even an accurate system. And yet they can do this because they are never held accountable for wildly inaccurate claims.

Walt Shaub: I’ve read such horror reports about these so-called reeducation camps. Is that the end product of all of this surveillance in that particular region?

Maya Wang: There are many reasons people are subjected to these political education camps, and not all of them were vetted through these surveillance systems. You have to remember that surveillance in China, in addition to the use of technology, is as much about the people who conduct the surveillance, who decide whom to send to these camps. There is some evidence that officials have been given quotas along the way: They have to catch, let’s say, a hundred people out of a village of 5,000 to put in camps, because that is supposedly about how many people have anti-government thoughts. So it’s not just about the use of technology; the Chinese government also has a very long history of conducting people-to-people surveillance, so to speak.

Walt Shaub: And are they tracking relationships as well? I mean, are they essentially finding guilt by association because you’re friends with somebody they are suspicious of?

Maya Wang: So, this app in Xinjiang — we reverse-engineered it in 2019, if I recall correctly, or at least we published our findings in 2019, and the version of the app I obtained was, I think, used around 2017. So you could say it’s five years old by now. At the time, the kind of relationships they tracked were, for example, whether or not you have any connections abroad. In particular, they considered 26 countries to be sensitive, countries like Turkey, Saudi Arabia, Afghanistan, and Indonesia. You can see a pattern here: These are countries with Muslim-majority populations. So anyone with a personal relationship to these countries, for example an uncle or a brother there, or who calls someone in these countries, or sends money to these countries, could be picked up by these surveillance systems and subjected to arbitrary detention, as I described.

Walt Shaub: It sounds like the surveillance is most intense in the Xinjiang region. Is it happening all over China as well?

Maya Wang: Absolutely. I think the surveillance is most intrusive, most visible, in Xinjiang, in the sense that you can really draw a straight line from the use of surveillance all the way to imprisonment. For people who study surveillance, especially in civil society in democracies, the challenge is often drawing that line, right?

Lots of proponents of surveillance will say, “Oh, well, it’s not as bad as you think.” But if you look at the more extreme end of government surveillance, in places like Xinjiang, you can really draw a straight line.

Elsewhere in China, though, it’s not as straightforward, because government surveillance is often very hidden. People are not actually told that they were being subjected to surveillance and that’s why they’re being put in prison.

They’re often not told. We were just lucky to find many forms of evidence that the authorities weren’t really hiding in Xinjiang, because they didn’t think anybody was looking.

In the rest of China, you can still find a lot of evidence, but there is less of a straight line, because the rest of China doesn’t have the political education camps that were so much a part of the repression of Turkic Muslims in the Xinjiang region. So we could see that very clearly there. The rest of China has similar systems, though. And I haven’t actually described how surveillance systems work in China generally.

At the very basic level, there is the requirement that everybody in China has to have an ID card and a number, and the number is used to access many different forms of public, and even private, services. That gives the authorities a good ability to track you across various spaces. On top of that, there’s a requirement of “real name registration”: When you take long-distance buses or trains, you have to use your ID number, which provides the ability to track your transportation, your movement.

In addition, your SIM card is tied to you — your phone has essentially become many people’s identifier over time, in many places. Your SIM card is connected to your phone, and your phone to its IMEI number and MAC address. All of that is connected to sensory systems throughout your environment, which feed into big data systems.
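Here is a minimal sketch, with invented data, of why a single mandatory ID number makes this kind of integration so powerful: any record keyed by the ID can be merged into one profile with an ordinary key-based join.

```python
from collections import defaultdict

# Three invented data sources, each keyed by the same national ID.
train_tickets = [{"id": "person-001", "route": "city A to city B"}]
sim_registry  = [{"id": "person-001", "imei": "35-209900-176148-1"}]
wifi_sensors  = [{"id": "person-001", "mac": "a4:5e:60:e2:91:0c"}]

# The shared ID does all the work: every source merges into one profile.
profiles = defaultdict(dict)
for source in (train_tickets, sim_registry, wifi_sensors):
    for row in source:
        profiles[row["id"]].update(row)

# One key now links travel history, phone hardware, and physical presence.
print(profiles["person-001"])
```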

So like I said earlier, your face is definitely an identifier, like your phone’s MAC address and IMEI number. But some of your identifiers are less easily altered than others. You can’t really change your face. Well, you could if you’re a professional criminal; there are also 3D masks now that you can print. Yeah.

Walt Shaub: I’ve got to get one of those.

Maya Wang: In case anyone is wondering. Or, you know, anti-facial recognition glasses, and so on.

But at any rate, your face is generally very hard to change meaningfully enough to avoid these systems. The idea is that there are many different forms of identifiers — your IP address and all of that — floating through these systems, right? They’re being collected in various ways and integrated into these big data systems to figure out whether there are abnormalities: how people are related, where they are going, who they are traveling with, which network devices go with one another.

Now, this kind of relationship information, like you were just asking about, is combined with modeling, computer modeling. So, let’s say someone who scams people online would generally have a certain pattern of bank activity, or someone who is a political activist generally has another set of personal relationships.

The idea is that, through understanding these general patterns, the police in China can catch and predict what they consider crimes, including political crimes. These are forms of what the authorities call “predictive policing,” and they are very much inspired by what’s going on in the U.S. Many of these systems draw their inspiration from policing tactics in the U.S. and the U.K.
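The pattern-matching idea Wang describes can be sketched as a simple template score (a hypothetical toy, not any documented system), in which a person is flagged once their observed behaviors cross a threshold. The output depends entirely on the assumptions baked into the hand-built template, which is exactly where bias enters.

```python
# Invented weights expressing how strongly each behavior is *assumed* to
# indicate "activist" activity; nothing here reflects a real system.
ACTIVIST_TEMPLATE = {
    "contacts_journalists": 3.0,
    "attends_gatherings":   2.0,
    "posts_about_politics": 1.5,
}
FLAG_THRESHOLD = 4.0

def risk_score(behaviors):
    """Sum the template weight of every behavior observed for this person."""
    return sum(w for feature, w in ACTIVIST_TEMPLATE.items() if behaviors.get(feature))

person = {"contacts_journalists": True, "posts_about_politics": True}
score = risk_score(person)
print(score, "flagged" if score >= FLAG_THRESHOLD else "not flagged")  # 4.5 flagged
```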

But the predictive policing systems being developed in recent years are much more sophisticated in their intentions and aspirations than what we documented in 2017 with the Integrated Joint Operations Platform. If you remember, the IJOP was more like: “If someone grows a beard, that person is suspicious. Let’s go and investigate that person.” That’s an A-to-B-to-C kind of approach — a straight line.

The predictive policing systems they are trying to develop now are much more about patterns of activities, and relationships, and networks, and so on. However — and this is also very important to remember — the Chinese government and the Chinese police are trying very hard, and there is also a lot of marketing material out there from companies saying, “Oh, we can do predictive policing,” and so on. But in actual practice, sources of information are not so easy to integrate. The police continue to struggle with data integration and complain about information silos.

Big data is also not very accurate. You need good-quality information from the bottom on up, and that requires police officers to be very diligent in collecting it. When they are really not that diligent, you have much less capability — these systems actually don’t get used.

So, this is to say that I think the Chinese government certainly has these intentions, and they certainly will develop these kinds of capabilities. Whether or not they’re able to use them, and whether or not they’re actually accomplishing their goals, are big questions. But whether or not they’re actually accomplishing them is a different question from whether or not they will try to use them anyway, even incorrectly, as in the cases in Xinjiang. Either way, these are abusive systems.

Walt Shaub: And so, what are they trying to predict with predictive technology? Are they trying to anticipate somebody who may become an opponent of the state?

Maya Wang: They do try to apply these predictive policing capabilities to some real crimes, not just to catching political activists. You do have real crimes in China that the police also have to try to catch. But the problem is that when these systems are situated in such an abusive environment, you get an essentially infinite expansion of surveillance capabilities, applied to catching even the most common crimes.

So, from a human rights point of view, going back to the basics: Governments can collect information about citizens, right? A very classic example is that hospitals have to know your medical record to provide the care you need. That is straightforward: You give consent; you know that data stays in the hospital and nowhere else. It certainly shouldn’t go to, say, the police, or to any other entity that has nothing to do with your care.

But collection of personal information should be necessary, proportionate, and lawful, according to international human rights law. That means it has to be necessary for the purpose it is supposed to achieve and proportional to the goal. Let’s say you’re protecting public security, say you’re catching someone for a terrorist crime; it actually has to be very proportional. You shouldn’t blanket the whole society with surveillance devices because you don’t want anyone to steal cigarettes from the convenience store. That would be wildly disproportionate surveillance, and unlawful.

There are laws that very specifically lay out what is proportional and what is necessary: Are there other means that could accomplish similar goals without dramatic abuses of rights? So let’s say you have a problem with cigarette theft. The solution is not to make sure the whole city is blanketed with surveillance devices. Instead you might, I don’t know, run some kind of sting operation, maybe educate the youth, or whatever —

Walt Shaub: Right, right.

Maya Wang: But these — these are rights-consistent —

Walt Shaub: But not pull out the satellite technology for a pack of cigarettes.

Maya Wang: Exactly, exactly. So the law would say, “In these very serious crimes, in these kinds of situations …” and there are restrictions on sharing of information, and so on. But the problem in China is that there is very little restraint on police power. Now, there is a series of laws and regulations in China, one of which actually protects private, personal information, which is new and certainly a positive step. But the intention of that law is primarily to rein in internet platforms, private companies, not the government. There are some provisions in there that theoretically should also regulate government data sharing and so on. But given how powerful the Chinese police are, these laws are unlikely to really restrain them, when the goal of policing is to integrate these sources of information to provide intelligence for their policing.

So the problem is that, essentially, the police can do whatever they want. They can integrate many different sources of data across their domain of policing and government, and then across different regions of China too. On top of that, you have very little accountability. Say the police issue a warrant for someone — there is no court involvement. The police don’t have to go to court to apply for anything; they just apply within the police. Theoretically, upper-level police officers should exercise some level of supervision, but how likely is that, really, when the police’s goal is to shore up the party’s power?

So, essentially, we’re talking about a very abusive environment. I’m sure people are familiar with the idea of mission creep, right? Surveillance systems are sometimes developed with the justification, “It’s about terrorists.” And there’s always this classic question about torture: “If someone is going to plant a bomb, are you allowed to torture this person?” That classic ethical dilemma, right?

And a surveillance system is a little bit like that. People imagine it’s all about serious crime — that if we don’t use these really abusive tactics, we won’t solve those crimes. But what actually happens is that the tactics become increasingly common, and they get applied to all kinds of situations, to people who are immigrants, for example, people who have no power to resist.

And, you know, to say that China has surveillance mission creep is perhaps not accurate, because these systems are designed to be essentially a kitchen sink: You throw every single problem of policing into the surveillance kitchen sink. And because the whole society is essentially surveilled, the attitude becomes, “Why not? Let’s try to catch a thief who has stolen some little thing from a shop.”

And this is where the problem is: The whole society is surveilled, and then for many different kinds of crimes, the mass surveillance system gets repurposed for those purposes.

[music]

Walt Shaub: I think that’s really a good point about mission creep because, even though China’s goal may be to use surveillance as expansively as possible, it’s something for us in the West to worry about too: If we give our government some room to surveil the population, how far do they take it? What kind of mission creep do they experience? And I think one thing you said that really gave me the chills was that China was adopting some of its techniques from watching what the U.S. and the U.K. are doing. I had always imagined it was the other way around, but the thought that they’re actually drawing lessons from us...

I wonder, and I think this may be my last question, whether there are lessons the West, and particularly the U.S., should learn from China’s experience, about what we should avoid to prevent this from becoming our future too.

Maya Wang: First of all, one positive aspect of this comparison is that what China is trying to accomplish — and like I said, they are far from being able to accomplish their aspirations, for all the reasons I was explaining — they were only able to reach the level of surveillance they have right now through many years of paving the way toward it, right?

You need a national ID number. You need to require people to use their real names to access services. You need to have essentially a vacuum: no laws regulating the police. You need to try to integrate sources of data. And in places like the U.S., for example, you have many different forces that militate against these kinds of systems, right?

The key is to understand that facial recognition is just one form of surveillance. I think the danger is the integration of data. And like I said, there are many different forms of personal identifiers, some harder to change than others. Your face is obviously an important identifier, one that really should be protected as sensitive information.

So, there should be laws that really tightly regulate the use of surveillance, both by the government and by private entities, laws that meet international human rights standards, as I just outlined. The government’s gathering of information needs to be necessary and proportionate.

Now, the U.S. doesn’t have a comprehensive federal privacy law, and it still lags behind many other developed countries when it comes to protecting people’s privacy. That leaves Americans vulnerable to many forms of surveillance, not just by the government and the police, but also by large platforms, which, outside of China, have the ability to integrate your data in a way that I think no other entities are quite capable of doing. We are talking about internet platforms that gather your internet browsing history, your shopping records, your IP address, your friends, your relationship networks. Some of them are even branching out to your DNA.

And when that information can be cross-referenced — location data is another big one — then you have a problem: It becomes very difficult or impossible to escape that kind of surveillance. Of course, these companies are nothing like the Chinese police, which also has the coercive instruments of the military and the state behind it. We are not talking about Western internet platforms having those coercive powers.

But nonetheless, what we need to watch out for is the ability of any entity to integrate large amounts of data, including your sensitive, identifying information, making it impossible to escape surveillance for different purposes, whether we are talking about surveillance capitalism or surveillance for political goals.

The context we’re in is very important, right? In China, you have essentially the same technologies, but the systems are different. We’re talking about integration of data under a government that is completely unaccountable, an authoritarian one, that uses these systems to essentially ensure that it will rule forever. But in a democratic society, what you really have to worry about is making sure that we don’t sleepwalk into that kind of situation.

The protection of personal information, to me, is almost one of the pillars of democracy. As we said, the U.S. is also experiencing quite a bit of democratic backsliding, right? It’s not so hard to imagine the U.S. electing someone with authoritarian tendencies, someone who is going to take away pillars of democracy, who is going to rig elections.

And the next thing is that you already have police forces with some of these surveillance systems. Integrate them. Make them go after dissidents and activists. And over time, maybe not immediately, you could have an authoritarian government here in the U.S. I think that is a very serious question: You have to guard against all of these elements to make democracy stronger, so that you do not become the kind of Orwellian society where the boot is permanently on the face of the human race.

But that’s not to say that tomorrow you flip the switch and society becomes that kind of surveillance state. I don’t think the U.S. is there yet. I’m not an expert on the U.S., but as I understand it, these surveillance systems, although quite powerful, rest with certain police forces or the NSA and are quite fragmented. They also have laws they have to follow, laws that are so much stronger, at least compared to China, where there is no law at all restraining police power.

Walt Shaub: Well, thank you so much, Maya. This has been wonderful talking to you, and such an education. And that’s it for this bonus content from The Continuous Action. If you haven’t listened to episode three of The Continuous Action, check it out. It features Maya Wang, Paul Butler, and Jake Laperruque talking about surveillance, and in particular facial recognition technology, in the United States and abroad.

The Continuous Action is hosted by me, Walt Shaub, and Virginia Heffernan. The main episodes are produced by Myron Kaplan with help from Bubba Bach; however, this bonus content was produced in-house by POGO. Music was supplied by Sonic Sanctuary through Shutterstock. And The Continuous Action, along with this bonus content, is created by the Project On Government Oversight — POGO.

[music]