Testimony

Recommendations on Massachusetts Face Recognition Legislative Reforms

Submitted via email

Dear Senator Eldridge, Representative Day, and members of the commission:

I am writing on behalf of The Constitution Project at the Project On Government Oversight about the need for serious limits on use of face recognition technology. We respectfully ask the commission to recommend the legislature strengthen existing facial recognition law to ensure Massachusetts residents and visitors are shielded from overbroad and pervasive surveillance.

Founded in 1981, the Project On Government Oversight (POGO) is a nonpartisan independent watchdog that investigates and exposes waste, corruption, and abuse of power, as well as instances in which the government fails to serve the public or silences those who report wrongdoing. We champion reforms to achieve a more effective, ethical, and accountable federal government that safeguards constitutional principles. The Constitution Project at POGO centers its work on issues such as guarding against improper and overbroad surveillance, including unchecked face recognition. In 2019, The Constitution Project convened a task force of expert stakeholders including academics, tech experts, civil rights and civil liberties advocates, and law enforcement officials to examine the impact of face recognition surveillance.1 It concluded that any law enforcement use of face recognition should be subject to strong limits, and it provided a set of policy recommendations to support legislatures in crafting reasonable but necessary limits.

In December 2020, Governor Charlie Baker signed into law “An Act Relative to Justice, Equity and Accountability in Law Enforcement in the Commonwealth,” an omnibus police reform bill. The law, codified in Chapter 253 of the Acts of 2020, contains several provisions pertaining to government agencies’ use of face recognition technology. We support some of those provisions, such as the creation of this commission and the development of independent authorization standards.

The new law’s policies governing police use of the technology, however, are insufficient as safeguards to protect civil rights and civil liberties. Fortunately, lawmakers have addressed these concerns in legislation filed this session in each chamber: H.135, An Act to Regulate Face Surveillance, sponsored by Representatives David Rogers and Orlando Ramos, and S.47, An Act to Regulate Face Surveillance, sponsored by Senator Cynthia Stone Creem.

These bills would improve rules on face recognition in a variety of ways, but we believe additional safeguards should be added. In particular, we recommend that legislation limiting face recognition include at least five key policy priorities:

  1. Requiring that face recognition searches be based on probable cause
  2. Limiting use of face recognition to the investigation of serious crimes
  3. Prohibiting face recognition from being used as the sole basis for arrests
  4. Requiring notice to defendants whenever face recognition is used
  5. Prohibiting face recognition from being used for untargeted surveillance

Require Face Recognition Searches Be Based on Probable Cause

Requiring that law enforcement demonstrate probable cause that the unknown person in question has committed a crime before using face recognition to identify that individual is a critical safeguard for preventing abuse. The primary police use of face recognition is to scan photographs of individuals taken during the commission of a crime; demonstrating probable cause in such scenarios should not be an onerous burden, and the requirement would still support legitimate law enforcement goals.

This requirement is essential to stopping face recognition from being used to catalog and target individuals engaged in constitutionally protected activities such as protesting, participating in political rallies, or attending religious services. The danger of this surveillance technology being misused in such a manner is not theoretical: Police have used face recognition on multiple occasions in recent years to identify peaceful civil rights protesters.2 Without a probable cause requirement, police could also use face recognition as a dragnet surveillance tool, scanning, identifying, and cataloging individuals’ activities on a mass scale.

While last year’s legislation establishes some requirements for police to conduct face recognition searches, the probable cause requirement in H.135 and S.47 would prevent mass surveillance and abuse without hampering legitimate law enforcement uses. Further, the legislation makes sensible exceptions, such as an emergency involving immediate danger or the identification of a deceased person.3 Raising the standard to probable cause would still support law enforcement needs while closing the door to fishing expeditions and excessive scans to identify people, including during sensitive activities and interactions.

Limit Use of Face Recognition to Investigating Serious Crimes

Another key pillar of effective safeguards on face recognition is limiting its use to the investigation of serious crimes. The concept of limiting use of powerful surveillance tools to top-tier investigations has clear precedent: It has been applied for over 50 years in similar surveillance contexts such as wiretapping.4

Face recognition should not be used to stockpile suspects for investigation of minor offenses.

In many places, police already use face recognition to investigate minor offenses such as shoplifting less than $15 of goods or stealing a pack of beer.5 These low-profile cases often receive little scrutiny, so it is more likely that erroneous uses of face recognition—which can stem from a variety of causes such as algorithmic bias, poor image quality, lax software settings, or even pseudo-scientific techniques6—will go unnoticed. For minor offenses, it is also more likely that potentially exculpatory evidence will not be sought out. This is especially concerning because face recognition can be notoriously inaccurate, especially for women and people of color.7

A serious crime limit for face recognition would also prevent the misuse of discretionary powers, including selective targeting of marginalized communities and dissidents. For example, in 2015, as demonstrators protested the death of Freddie Gray in police custody, Baltimore police used face recognition to target protesters, scanning the crowd with the technology to find and arrest anyone who had an outstanding warrant for any offense.8 Without a serious crime limit, face recognition could be used in this manner on a broad scale, weaponized for selective enforcement of bench warrants for minor offenses and targeted at marginalized communities, political dissidents, and other vulnerable individuals.

By restricting use of the technology to investigating violent felonies, H.135 and S.47 prevent these risks while still permitting limited use for investigating offenses, such as homicides, that are genuine public safety priorities.9

Prohibit Face Recognition from Being Used as the Sole Basis for an Arrest

In addition to the measures above designed to prevent abuse and excess surveillance, effective safeguards on face recognition must also include policies that prevent excess reliance on the technology, which can be prone to error. There are already numerous documented instances in which a face recognition misidentification led to a wrongful arrest.10 While a probable cause requirement to run scans provides significant value, it does not prevent the harms that can arise when law enforcement relies excessively on the results of searches.

Factors that reduce image quality, such as bad lighting, indirect angles, distance, poor cameras, and low image resolution, all make misidentifications more likely. Lax system settings, such as employing a lower confidence threshold to trigger matches11 or having broad sets of matches appear in search results, increase the potential that law enforcement will receive erroneous matches as well. Even as face recognition software improves in quality—and even if algorithmic bias dissipates—there will always be situation-based limits to how effective the technology is. And there will always be a danger in giving too much credence to matches that could misidentify innocent individuals.

Current law does not place a limit on how much police can rely on face recognition matches, and unfortunately, as currently written, H.135 and S.47 do not remedy this problem. We recommend these bills be strengthened to prohibit face recognition matches from serving as the sole basis for an arrest, or from constituting, on their own, probable cause for a search. This is a key safeguard against the harm caused by over-reliance on misidentifications.

Require Defendants Be Given Notice When Face Recognition Is Used

Like any other complex forensic tool, face recognition’s effectiveness can depend on technical factors and manner of use. That is why it is critical that defendants are notified and given the opportunity to examine face recognition technology whenever it is used in an investigation.

Defendants have a vested interest in reviewing a variety of factors, such as algorithm quality, the software settings police used, and whether any other potential matches were discovered or investigated, that could provide exculpatory or mitigating evidence. Guaranteeing access to this information is not only critical for due process rights but also acts as an important safeguard to deter corner cutting and sloppy use of face recognition during investigations.

Despite the importance of disclosure, it rarely occurs.12 In some jurisdictions, law enforcement uses facial recognition thousands of times per month, and defendants almost never receive notice of its use in investigations.13 Yet even as law enforcement relies on the technology for investigations, they obscure it from examination in court by defendants and judges.14 In Massachusetts, current law does not provide any due process protections for criminal defendants who have been subject to the use of facial recognition systems.

H.135 and S.47 require that law enforcement agencies and district attorneys make available to criminal defendants and their attorneys all records and information pertaining to face recognition searches performed or requested during investigations.15 This requirement would protect due process rights and strongly incentivize law enforcement to adhere to the highest standards in their use of face recognition.

Prohibit Face Recognition from Being Used for Untargeted Surveillance

An especially dangerous application of face recognition is its use not to identify a designated person in an image, but rather to conduct untargeted scans of all individuals on video streams.

According to early testing, untargeted face recognition is notoriously inaccurate: In pilot programs in the United Kingdom, South Wales Police had a 91% error rate and London Metropolitan Police had a 98% error rate.16 In the United States, the technology has not received any comparable testing or vetting.

Yet even if untargeted face recognition improves in accuracy, it would still present a serious threat to civil rights and civil liberties. This type of surveillance system could effortlessly monitor individuals’ movements, interactions, and activities on a mass scale.

Current law does not regulate the use of face recognition for purposes of surveillance of public spaces, leaving open the possibility of video surveillance systems incorporating untargeted face recognition in the future. H.135 and S.47 would pre-empt this danger by prohibiting the use of untargeted face recognition for general public surveillance.17

We encourage you to consider bills H.135 and S.47 when you decide on further regulations of face recognition technology. We need strong safeguards to ensure that this technology does not infringe on civil rights and civil liberties, and this legislation offers an effective path for achieving this important goal.

Thank you for your attention and consideration.

Jake Laperruque
Senior Counsel
The Constitution Project at the Project On Government Oversight