Office of Science and Technology Policy
Executive Office of the President, Eisenhower Executive Office Building
1650 Pennsylvania Avenue
Washington, DC 20504
Via email: [email protected]
Subject: Comment in Response to Request for Information (RFI) on Public and Private Sector Uses of Biometric Technologies
To the Office of Science and Technology Policy:
Thank you for the opportunity to submit comments to the White House Office of Science and Technology Policy regarding use of biometric technologies for the purposes of identity verification, identification of individuals, and inference of attributes. Law enforcement use of face recognition involves the risk of misidentification as well as misuse — each of which causes serious harm to civil rights, civil liberties, and public welfare.
Our comment will focus on one biometric identification technology — face recognition — that is already used on a large scale by local, state, and federal law enforcement. Our comment describes the multitude of dangers face recognition, as currently deployed, poses to civil rights and civil liberties, examines how current federal government policies contribute to this danger, and recommends a series of White House policies to reduce the harm face recognition causes.
Founded in 1981, the Project On Government Oversight (POGO) is a nonpartisan independent watchdog that investigates and exposes waste, corruption, and abuse of power, as well as instances when the government fails to serve the public or silences those who report wrongdoing. We champion reforms to achieve a more effective, ethical, and accountable federal government that safeguards constitutional principles.
The Constitution Project at POGO centers its work on issues such as guarding against improper and overbroad surveillance, including unchecked face recognition. In 2019, The Constitution Project at POGO convened a face recognition task force of expert stakeholders to examine the impact of face recognition surveillance.1 The task force included academics, tech experts, civil rights and civil liberties advocates, and law enforcement officials. Our group concluded that any law enforcement use of face recognition should be subject to strong limits, and provided a set of policy recommendations to support legislatures in crafting necessary and reasonable limits.
Law Enforcement’s Use of Face Recognition Is Already Significant
At least one in four state and local police departments have the capacity to run face recognition searches, either directly or through a partnering agency.2 According to a BuzzFeed News investigative report, over 1,800 publicly funded agencies have used Clearview AI — a face recognition system that is especially problematic because its photo database was built by scraping social media without users’ consent.3
The FBI oversees a massive face recognition system through its Facial Analysis, Comparison, and Evaluation Services Unit, with capacity to scan hundreds of millions of photos, including nearly one out of every three drivers’ license photos.4 In addition to conducting face recognition scans for its own investigations, the FBI also employs its Next Generation Identification-Interstate Photo System to process requests for scans largely from state and local law enforcement.5 The FBI no longer discloses how many face recognition searches it runs, but it previously processed as many as 8,000 searches per month on average.6
Federal use of face recognition extends beyond the FBI. Sixteen federal agencies use face recognition, including half a dozen that employ it in criminal investigations.7 Notably, Immigration and Customs Enforcement has in recent years run thousands of face recognition searches of state drivers’ license databases,8 sometimes without the oversight or approval of state officials.9 ICE’s access to face recognition appears to be a driving factor in the detention of immigrants.10
Face Recognition Misidentifications Cause Serious Harm to Civil Rights, Civil Liberties, and Public Welfare
One of the most acute risks of face recognition is its potential to misidentify individuals. In the law enforcement context, this can lead to incorrect identification of suspects, as well as wrongful arrests and incarceration.
Face Recognition Misidentifications Stem from a Variety of Causes, Many of Which Cannot Be Prevented
First, the quality of face recognition algorithms can vary significantly, causing acute harms to civil liberties. Notably, many algorithms misidentify women and people of color at a higher rate than other people. Studies by the National Institute of Standards and Technology; researchers from the Massachusetts Institute of Technology, Microsoft, and the AI Now Institute; the American Civil Liberties Union; and an FBI expert all concluded that face recognition systems misidentify women and people of color more frequently.11 Most recently, the National Institute of Standards and Technology found that some systems were 100 times more likely to misidentify people of East Asian and African descent than white people.12 Failure to recognize the significance of this problem — and to account for it in software selection and review, training, and auditing — will undermine investigations, seriously endanger civil rights, and set back efforts to reduce systemic bias in policing and the criminal justice system.
Second, image quality can also significantly impact the accuracy of matches. Sets of reference images — databases containing previously identified faces — in face recognition systems are typically high-resolution photos of a person directly facing a camera at close range, such as for a mug shot photo. But probe images — from which law enforcement seeks to identify individuals — are derived from a wide range of situations, which creates the potential for low image quality and erroneous results. Bad lighting, indirect angles, distance, poor camera quality, and low image resolution all make misidentifications more likely.13 These poor image conditions are more common when photos and videos are taken in public, such as with a CCTV camera. But these low-quality images often serve as probe images for face recognition scans, without proper consideration for their diminished utility.14
Third, even with more effective software and higher quality images, system settings can make face recognition matches prone to misidentification. For example, the way law enforcement sets confidence thresholds — a metric used to gauge which proposed matches within a system are more likely to be accurate — can undermine the reliability of results. The lower the confidence threshold, the more likely a “match” is actually a false positive. So, if law enforcement entities set face recognition systems to always return potential matches, no matter how low the confidence threshold, they will receive untrustworthy results. Troublingly, some law enforcement entities, including the FBI, do just that.15
In the absence of safeguards to address this range of misidentification risks, face recognition will continue to produce errors, harm innocent individuals, and exacerbate inequalities in how different communities are policed.
Using Face Recognition to Generate Leads Does Not Avoid Harms Such as Wrongful Arrests
Law enforcement officials supporting the use of face recognition, such as FBI Director Christopher Wray, downplay the dangers of misidentification by arguing that face recognition is just used for leads.16 But this ignores a basic fact: Leads can vary immensely in how reliable they are, as well as in how much or how little they might impact the course of an investigation. Law enforcement has previously relied heavily on certain “scientific” forensic evidence techniques — techniques touted as presumptively objective, consistent, and reliable — that were, in fact, highly misleading.17 As is the case with bite-mark analysis or lie detector tests,18 the fact that face recognition is merely used as a lead does not prevent it from producing errors that cause the arrest or incarceration of innocent individuals.
Individuals could be — and in numerous recorded cases have been — charged in part based on how a face recognition match affects the direction of an investigation early on. Law enforcement overconfidence in the accuracy of matches can promote confirmation bias and sloppy follow-up, limiting the ability to identify face recognition errors.19
It is also important to recognize that even if errors in face recognition systems are eventually discovered and accounted for, face recognition mismatches can form the basis of individuals becoming investigative targets. A variety of disruptive and potentially traumatic police actions can flow from such errors, such as being stopped, searched, monitored for prolonged periods of time, or detained and questioned. These harms will be disproportionately borne by people of color so long as algorithmic bias is present in face recognition systems, and more generally so long as systemic bias impacts policing and our criminal justice system.
It is critical when evaluating these harms to consider the real-world human cost. Take, for example, the impact that a faulty face recognition match had on Robert Williams, who was arrested at his home and subsequently held in custody for 30 hours:
During the time that I was in custody, my wife was dealing with the emotional and practical immediate fall out. My daughters were scared, wondering why their father had been arrested and whether he would come home. My wife had to comfort them. While I was away, our oldest daughter turned over a family photo that was sitting out on the family furniture because she couldn’t bear looking at a picture of her Daddy under the circumstances. My wife also had to call my employer and explain to them where I was and why I wouldn’t be coming to work that day. They could have fired me right then …. My daughters can’t unsee me being handcuffed and put into a police car. They continue to suffer that trauma. For example, after I returned from Jail, they started playing cops and robbers games where they tell me that I’m in jail for stealing. And even today, when my daughters encounter the coverage about what happened to me, they are reduced to tears by their memory of those awful days.20
Despite this traumatic experience, Williams describes himself as “lucky” that the harm he and his family unjustly experienced was not more severe.21
Finally, the notion that face recognition matches are merely leads that serve as one of many components of an investigation is often simply untrue. There are already three documented cases where individuals were wrongfully arrested — with two spending time in jail — based entirely on bad face recognition matches.22 According to a 2020 New York Times investigation of face recognition systems in Florida, “Although officials said investigators could not rely on facial recognition results to make an arrest, documents suggested that on occasion officers gathered no other evidence.”23 And because use of face recognition in investigations is often hidden from arrestees and defendants, there are likely many similar instances of face recognition being the sole basis for an arrest that remain hidden from the public.
Lack of Disclosure Augments Misidentification Risks and Undermines Due Process Rights
The risks of misidentification causing serious harm are increased by the fact that use of face recognition is often hidden from defendants. Given the wide range of technical factors that can impact face recognition’s effectiveness, it is critical that defendants are notified and given the opportunity to examine face recognition technology whenever it is used in an investigation, as they would with any other complex forensic tool.
Despite the importance of disclosure, it rarely occurs.24 In some jurisdictions, law enforcement uses face recognition thousands of times per month, yet defendants almost never receive notice of its use in investigations.25 Even as law enforcement relies on the technology for investigations, it shields the technology from examination in court by defendants and judges.26
Defendants have a vested interest in reviewing a variety of factors, such as algorithm quality, the software settings police used, and whether any other potential matches were discovered or investigated that could provide exculpatory or mitigating evidence. This is key not only to protecting innocent individuals, but also to preserving constitutionally guaranteed due process rights of all defendants and promoting genuine public safety.
Furthermore, guaranteeing access to this information is not only critical for due process rights, but also acts as an important safeguard to deter corner cutting and inappropriate use of face recognition during investigations.
Face Recognition Also Creates Risks of Pervasive Surveillance That Is Fundamentally Incompatible with Democratic Society
One of the most important aspects of face recognition is that even if the government could mitigate the dangers of misidentification, doing so would not lessen the dangers of the technology as a whole. Simply put, face recognition surveillance is dangerous when it does not work, but also dangerous in a different — yet equally important — way when it does.
In the digital age, privacy rights do not just protect the inside of our homes from improper intrusion; they are also critical safeguards from overbearing government power. As Justice Sonia Sotomayor warned about emerging surveillance technologies when the Supreme Court first examined the issue of electronic location tracking, “making available at a relatively low cost such a substantial quantum of intimate information about any person whom the government, in its unfettered discretion, chooses to track … may ‘alter the relationship between citizen and government in a way that is inimical to democratic society.’”27
If unrestrained in its use, face recognition surveillance offers so much information and power to government that it could upend basic rights and foundations of democracy. In authoritarian regimes such as China, face recognition is used as a tool of social control, enabling mass cataloging of activities and draconian enforcement of minor offenses.28 It has been weaponized for continuous surveillance and brutal oppression of the country’s Uighur minority.29 It is used to identify, discourage, and detain pro-democracy protesters.30
The frightening abuse of face recognition technology is not limited to China. Officials in the U.S. have already used the technology to undermine fundamental features of democracy here.
According to a South Florida Sun Sentinel investigation, in 2020, law enforcement repeatedly used face recognition to identify and catalog peaceful protesters. Fort Lauderdale police ran numerous face recognition searches to identify people who might be a “possible protest organizer” or an “associate of protest organizer” at a peaceful Juneteenth event to promote defunding the police. Boca Raton police also ran face recognition scans on half a dozen occasions throughout May 2020 targeting protesters during peaceful events. And the Broward Sheriff’s Office ran nearly 20 face recognition searches during this same time period for the purpose of “intelligence” collection, rather than to investigate any criminal offense.31
Face recognition has also been used to selectively target individuals who are protesting, with law enforcement using the technology to rapidly scan protests for individuals with active bench warrants for unrelated offenses. Several years ago, Baltimore police used face recognition amid protests to find individuals with “outstanding warrants and arrest[ed] them directly from the crowd,” in a selective effort that appeared to be aimed at disrupting, punishing, and discouraging demonstrators from protesting.32
Federal Use of Face Recognition Should Be Seriously Curtailed to Protect Civil Rights and Civil Liberties
Federal use of face recognition contains serious flaws, creating potential for both overreliance on misidentifications and misuse. FBI systems guarantee that match results are returned for every scan, amplifying the risk of errors, as does the lack of oversight that stems from failure to disclose the use of face recognition to courts and defendants. FBI officials can conduct face recognition searches pursuant to criminal investigations or mere open assessments, meaning that searches do not require probable cause, or even suspicion of wrongdoing.33
ICE face recognition systems can similarly be used for photos that are “in furtherance of ongoing investigations,” and can be used to support immigration detention and deportation so long as the scan is not conducted “solely in furtherance of civil immigration enforcement.”34
The risks of face recognition being misused should be taken especially seriously in light of how federal law enforcement has used surveillance to target protesters35 and the press in recent years,36 as well as lawmakers, their staff, and their families.37
Currently, the only meaningful restriction on federal law enforcement use of face recognition is the FBI requirement that matches cannot serve as the sole basis for arrests or other law enforcement action.38 But for the reasons described above, requiring that face recognition be used only for leads does not eliminate the risk of error. And the lack of disclosure to defendants removes a key safeguard for ensuring that even this rule is effectively applied.
In order to effectively reduce the dangers posed by face recognition surveillance, we recommend the White House establish the following policy requirements for all federal law enforcement use of the technology:
- Probable cause rule: Require that all scans be predicated on probable cause that the individual to be identified has committed, is committing, or is planning to commit the offense being investigated.
- Serious crime limit: Limit use of face recognition to the investigation of violent felonies.
- Disclosure requirements: Require that any use of face recognition during an investigation is disclosed to defendants.
Federal Assistance to State and Local Law Enforcement for Face Recognition Use Is Dangerously Unregulated and Must Be Reformed
In addition to its direct use by federal law enforcement, the federal government provides significant support for face recognition surveillance by state and local law enforcement, with inadequate safeguards.
The FBI’s Next Generation Identification-Interstate Photo System (NGI-IPS) provides state and local law enforcement across the country with the ability to run face recognition searches on a mass scale. Searches can be run through NGI-IPS so long as the relevant photos are “obtained pursuant to an authorized criminal investigation,” a standard that does not require probable cause or even suspicion of wrongdoing. FBI policy authorizes use of its system to identify individuals engaged in First Amendment-protected activities (such as lawful assembly and protests) so long as the scanned photo is “pertinent to and within the scope of an authorized law enforcement activity.”39
These rules are insufficient to protect against pervasive surveillance and selective targeting, including the type of selective targeting that has previously been directed at peaceful protesters.
Further, FBI rules contain no known restrictions on which law enforcement entities are authorized to run scans through its systems, including entities that may be under investigation by the Justice Department for systemic violation of constitutional rights — investigations the White House describes as “critical tools to promote constitutional policing in jurisdictions where reform is warranted.”40 This raises the important question of why the department would provide police departments access to a powerful technology such as face recognition — one susceptible to abuse and to undermining constitutional rights — even as it acts to curtail systemic abuse within those departments.
As with its own use, the FBI places no meaningful restrictions on how state and local law enforcement use NGI-IPS for face recognition searches, other than requiring that matches cannot serve as the sole basis for law enforcement actions such as arrests,41 an inadequate measure to guard against both overreliance on misidentifications and misuse. Even when NGI-IPS is used properly for legitimate law enforcement needs, the consistent pattern of leaving defendants uninformed undermines critical due process rights to review investigative evidence and techniques.
Further, the federal government should examine and act on its role in promoting inadequately restricted face recognition systems operated at the state and local level. Federal grant funding for local policing has been used directly to create face recognition systems.42 Federal funds are also used to develop mass video surveillance networks that power the collection of images run through local face recognition systems.43
The federal government bears responsibility if it — either by providing direct access to scans or funding for locally managed systems — enables state and local law enforcement to use face recognition in a manner that endangers civil rights and civil liberties, and should act to prevent such use. We recommend the White House establish the following policy requirements regarding assistance to state and local law enforcement involving face recognition:
- Probable cause rule: Only allow state and local law enforcement entities to run searches through NGI-IPS or other federal face recognition systems if scans are predicated on probable cause that the individual to be identified has committed, is committing, or is planning to commit the offense being investigated.
- Serious crime limit: Only allow state and local law enforcement entities to run searches through NGI-IPS or other federal face recognition systems if scans are for the investigation of violent felonies.
- Prohibit use by bad actors: Prohibit use of NGI-IPS or other federal face recognition systems by law enforcement entities under pattern and practice investigations for biased policing and other unconstitutional practices.
- Disclosure requirements: Require that any state and local law enforcement entity that uses NGI-IPS or other federal face recognition systems disclose such use to defendants.
- Funding contingent on reasonable regulations: Condition any federal funding that a state or local law enforcement entity uses for face recognition — or for video surveillance that could feed face recognition scans — on that entity abiding by these probable cause rules, serious crime limits, and disclosure requirements in its own use of face recognition.
Thank you for the opportunity to provide this comment in response to the White House Office of Science and Technology Policy’s request for information. We strongly hope the White House will adopt the recommended policies in support of its ongoing commitment to civil rights, civil liberties, and improving racial equity in criminal justice and policing.
Senior Policy Counsel
The Constitution Project at the Project On Government Oversight
The Constitution Project seeks to safeguard our constitutional rights when the government exercises power in the name of national security and domestic policing, including ensuring our institutions serve as a check on that power.
9. Drew Harwell and Erin Cox, “ICE has run facial-recognition searches on millions of Maryland drivers,” Washington Post, February 26, 2020, https://www.washingtonpost.com/technology/2020/02/26/ice-has-run-facial-recognition-searches-millions-maryland-drivers/.
10. Drew Harwell and Erin Cox, “ICE has run facial-recognition searches on millions of Maryland drivers,” Washington Post, February 26, 2020, https://www.washingtonpost.com/technology/2020/02/26/ice-has-run-facial-recognition-searches-millions-maryland-drivers/ (“[CASA] now says ICE’s open access to MVA photos and other data was a main reason for the detentions”).
11. Patrick Grother, Mei Ngan, and Kayee Hanaoka, National Institute of Standards and Technology, Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects, NISTIR 8280 (December 19, 2019), 2, https://doi.org/10.6028/NIST.I...; Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Proceedings of Machine Learning Research, vol. 81 (2018), http://proceedings.mlr.press/v...; Joy Buolamwini and Inioluwa Deborah Raji, “Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products,” AIES ’19: Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society (2019), https://www.media.mit.edu/publications/actionable-auditing-investigating-the-impact-of-publicly-naming-biased-performance-results-of-commercial-ai-products/; Jacob Snow, “Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots,” American Civil Liberties Union, July 26, 2018, https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28; Brendan Klare et al., “Face Recognition Performance: Role of Demographic Information,” IEEE Transactions on Information Forensics and Security, vol. 7, no. 6 (December 2012), http://openbiometrics.org/publications/klare2012demographics.pdf.
12. Grother, Ngan, and Hanaoka, Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects, 2.
13. Task Force on Facial Recognition Surveillance, Project On Government Oversight, Facing the Future of Surveillance (March 4, 2019), Sec. II, https://www.pogo.org/report/2019/03/facing-the-future-of-surveillance/.
14. “CCTV feeds facial recognition systems for law enforcement,” Biometric Technology Today, vol. 2015, no. 4 (April 2015): 3, https://www.sciencedirect.com/science/article/abs/pii/S0969476515300539.
15. Jim Trainum, “Facial Recognition Surveillance Doesn’t Necessarily Make You Safer,” Project On Government Oversight, July 22, 2019, https://www.pogo.org/analysis/2019/07/facial-recognition-surveillance-doesnt-necessarily-make-you-safer/. According to then-FBI Deputy Assistant Director Kimberly Del Greco, its system is set up so that it “returns a gallery of ‘candidate’ photos [reference photos] of 2-50 individuals (the default is 20).” Facial Recognition Technology (Part II): Ensuring Transparency in Government Use: Hearing before the House Committee on Oversight, 116th Cong. (June 4, 2019) (statement of Kimberly Del Greco, Deputy Assistant Director, FBI Criminal Justice Information Services Division), https://www.fbi.gov/news/testimony/facial-recognition-technology-ensuring-transparency-in-government-use; Erin M. Priest, Privacy and Civil Liberties Officer, FBI, Privacy Impact Assessment for the Next Generation Identification-Interstate Photo System (May 2019), https://www.fbi.gov/file-repository/pia-ngi-interstate-photo-system.pdf/view (“A gallery of two to fifty photos will be returned, with the law enforcement agency choosing the size of the gallery. If no choice is made, a default of twenty photos is returned.”).
16. For example, during a 2020 congressional hearing, FBI Director Christopher Wray responded to inquiries on face recognition by stating, “We use it for lead value. We don’t use facial recognition as a basis to arrest or convict.” Oversight of the Federal Bureau of Investigation: Hearing before the House Judiciary Committee, 116th Cong. (February 5, 2020), https://judiciary.house.gov/calendar/eventsingle.aspx?EventID=2780.
17. See President’s Council of Advisors on Science and Technology, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods (September 2016), https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf.
18. Joseph Stromberg, “Lie detectors: Why they don’t work, and why police use them anyway,” Vox, December 15, 2014, https://www.vox.com/2014/8/14/5999119/polygraphs-lie-detectors-do-they-work.
19. For example, in one incident, New York City Police Department officers allegedly took a face recognition match and, rather than try to legitimately confirm or disconfirm its accuracy, texted a witness, “Is this the guy…?” along with a single photo, rather than following proper procedure to use a photo array. Clare Garvie, “Garbage In, Garbage Out: Face Recognition on Flawed Data,” Georgetown Law Center on Privacy & Technology, May 16, 2019, https://www.flawedfacedata.com/.
20. Facial Recognition Technology: Examining Its Use By Law Enforcement: Hearing before the House Judiciary Committee Subcommittee on Crime, Terrorism, and Homeland Security, 117th Cong. (July 13, 2021) (statement of Robert Williams), https://docs.house.gov/meetings/JU/JU08/20210713/113906/HMTG-117-JU08-Wstate-WilliamsR-20210713.pdf.
21. Facial Recognition Technology: Examining Its Use By Law Enforcement: Hearing before the House Judiciary Committee Subcommittee on Crime, Terrorism, and Homeland Security, 117th Cong. (July 13, 2021) (statement of Robert Williams), https://docs.house.gov/meetings/JU/JU08/20210713/113906/HMTG-117-JU08-Wstate-WilliamsR-20210713.pdf.
22. Kashmir Hill, “Wrongfully Accused By An Algorithm,” New York Times, June 24, 2020, https://www.nytimes.com/2020/0...; K. Holt, “Facial recognition linked to a second wrongful arrest by Detroit police,” Engadget, July 10, 2020, https://www.engadget.com/facial-recognition-false-match-wrongful-arrest-224053761.html; Kashmir Hill, “Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match,” New York Times, December 29, 2020, https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html.
23. Jennifer Valentino-DeVries, “How the Police Use Facial Recognition, and Where It Falls Short,” New York Times, January 12, 2020, https://www.nytimes.com/2020/01/12/technology/facial-recognition-police.html.
24. Aaron Mak, “Facing Facts,” Slate, January 25, 2019, https://slate.com/technology/2019/01/facial-recognition-arrest-transparency-willie-allen-lynch.html.
25. Jennifer Valentino-DeVries, “How the Police Use Facial Recognition, and Where It Falls Short,” New York Times, January 12, 2020, https://www.nytimes.com/2020/01/12/technology/facial-recognition-police.html.
26. Face recognition “can play a significant role in investigations, though, without the judicial scrutiny applied to more proven forensic technologies.” Jennifer Valentino-DeVries, “How the Police Use Facial Recognition, and Where It Falls Short,” New York Times, January 12, 2020, https://www.nytimes.com/2020/01/12/technology/facial-recognition-police.html.
27. United States v. Jones, 565 U.S. 400, 415-16 (2012) (Sotomayor, J., concurring).
28. Alfred Ng, “How China uses facial recognition to control human behavior,” CNet, August 11, 2020, https://www.cnet.com/news/in-china-facial-recognition-public-shaming-and-control-go-hand-in-hand/.
29. Paul Mozur, “One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority,” New York Times, April 14, 2019, https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html; Drew Harwell and Eva Dou, “Huawei tested AI software that could recognize Uighur minorities and alert police, report says,” Washington Post, December 8, 2020, https://www.washingtonpost.com/technology/2020/12/08/huawei-tested-ai-software-that-could-recognize-uighur-minorities-alert-police-report-says/.
30. Paul Mozur, “In Hong Kong Protests, Faces Become Weapons,” New York Times, July 26, 2019, https://www.nytimes.com/2019/07/26/technology/hong-kong-protests-facial-recognition-surveillance.html; Stephan Kafeero, “Uganda is using Huawei’s facial recognition tech to crack down on dissent after anti-government protests,” Quartz Africa, November 27, 2020, https://qz.com/africa/1938976/uganda-uses-chinas-huawei-facial-recognition-to-snare-protesters/.
31. Joanne Cavanaugh Simpson and Marc Freeman, “South Florida police quietly ran facial recognition scans to identify peaceful protestors. Is that legal?” South Florida Sun Sentinel, June 26, 2021, https://www.sun-sentinel.com/local/broward/fl-ne-facial-recognition-protests-20210626-7sll5uuaqfbeba32rndlv3xwxi-htmlstory.html.
32. Kevin Rector and Alison Knezevich, “Social media companies rescind access to Geofeedia, which fed information to police during 2015 unrest,” Baltimore Sun, October 11, 2016, https://www.baltimoresun.com/news/crime/bs-md-geofeedia-update-20161011-story.html.
33. Facial Recognition Technology (Part II): Ensuring Transparency in Government Use: Hearing before the House Committee on Oversight, 116th Cong. (June 4, 2019) (statement of Kimberly Del Greco, Deputy Assistant Director, FBI Criminal Justice Information Services Division), https://www.fbi.gov/news/testimony/facial-recognition-technology-ensuring-transparency-in-government-use; Erin M. Priest, Privacy and Civil Liberties Officer, FBI, Privacy Impact Assessment for the Facial Analysis, Comparison, and Evaluation (FACE) Phase II System (2019).
34. Department of Homeland Security, U.S. Immigration and Customs Enforcement, Privacy Impact Assessment for the ICE Use of Facial Recognition Services (May 13, 2020), https://www.dhs.gov/sites/default/files/publications/privacy-pia-ice-frs-054-may2020.pdf.
35. Jimmy Tobias, “Exclusive: ICE Has Kept Tabs on ‘Anti-Trump’ Protesters in New York City,” The Nation, March 6, 2019, https://www.thenation.com/article/archive/ice-immigration-protest-spreadsheet-tracking/.
36. Tom Jones, Mari Payton, and Bill Feather, “Source: Leaked Documents Show the U.S. Government Tracking Journalists and Immigration Advocates Through a Secret Database,” NBC 7, March 6, 2019, https://www.nbcsandiego.com/investigations/Source-Leaked-Documents-Show-the-US-Government-Tracking-Journalists-and-Advocates-Through-a-Secret-Database-506783231.html; Ryan Devereaux, “Journalists, Lawyers, And Activists Working On The Border Face Coordinated Harassment from U.S. and Mexican Authorities,” The Intercept, February 8, 2019, https://theintercept.com/2019/02/08/us-mexico-border-journalists-harassment/; Charlie Savage and Katie Benner, “Trump Administration Secretly Seized Phone Records of Times Reporters,” New York Times, June 11, 2021, https://www.nytimes.com/2021/06/02/us/trump-administration-phone-records-times-reporters.html; Jana Winter, “Operation Whistle Pig: Inside the secret CBP unit with no rules that investigates Americans,” Yahoo News, December 11, 2021, https://news.yahoo.com/operation-whistle-pig-inside-the-secret-cbp-unit-with-no-rules-that-investigates-americans-100000147.html?guccounter=1.
37. Katie Benner, Nicholas Fandos, Michael S. Schmidt, and Adam Goldman, “Hunting Leaks, Trump Officials Focused on Democrats in Congress,” New York Times, June 14, 2021, https://www.nytimes.com/2021/06/10/us/politics/justice-department-leaks-trump-administration.html.
38. Erin M. Priest, Privacy and Civil Liberties Officer, FBI, Privacy Impact Assessment for the Next Generation Identification-Interstate Photo System (May 2019), https://www.fbi.gov/file-repository/pia-ngi-interstate-photo-system.pdf/view.
39. Erin M. Priest, Privacy and Civil Liberties Officer, FBI, Privacy Impact Assessment for the Next Generation Identification-Interstate Photo System (May 2019), https://www.fbi.gov/file-repository/pia-ngi-interstate-photo-system.pdf/view.
40 The White House, “FACT SHEET: The Biden-Harris Administration is Taking Action to Restore and Strengthen American Democracy,” December 8, 2021, https://www.whitehouse.gov/briefing-room/statements-releases/2021/12/08/fact-sheet-the-biden-harris-administration-is-taking-action-to-restore-and-strengthen-american-democracy/. 41 Erin M. Priest, Privacy and Civil Liberties Officer, FBI, Privacy Impact Assessment for the Next Generation Identification-Interstate Photo System (May 2019), https://www.fbi.gov/file-repository/pia-ngi-interstate-photo-system.pdf/view. 42
See e.g., Department of Justice Bureau of Justice Assistance, “Facial Recognition Technology,” September 6, 2016, https://bja.ojp.gov/funding/awards/2016-dj-bx-1049.43 See e.g., Aaron Mondry, “Criticism mounts over Detroit Police Department’s facial recognition software,” Curbed Detroit, July 8, 2019, https://detroit.curbed.com/2019/7/8/20687045/project-green-light-detroit-facial-recognition-technology.