Facial Recognition Software Sparks Transparency Battle

By Jack Karp | November 3, 2019, 8:02 PM EST

A blurry cellphone photo. That's what Jacksonville, Florida, police used to identify and arrest Willie Allen Lynch for selling $50 worth of crack cocaine.

Two undercover officers bought the drugs from a black man identified only as "Midnight," and, though they didn't arrest him, one of them surreptitiously took several photos of him while pretending to make a call, according to court documents.

A crime analyst ran the photos through a facial recognition program, which spit out several potential matches, one of whom was Lynch, the documents said.

But Lynch, who was later convicted, says he was misidentified, and transparency concerns have loomed large over the case. The software produced other matches, but only his was forwarded to police, a fact that was kept from Lynch, said Somil Trivedi, a senior staff attorney at the American Civil Liberties Union's Criminal Law Reform Project who filed an amicus brief in Lynch's appeal. Lynch also was never told that the program had given his photo a one-star rating as a potential match, Trivedi said.

"Now, because we haven't gotten access to all of the information, which is the point of the case, we don't know what a one-star rating means," Trivedi said. "But I think anyone who's used Yelp can figure out that's not a high degree of confidence in your restaurant choice for lunch."

The "unreliability" of identifications like these "is hard to overstate," Trivedi said. Critics of the technology point to studies that have shown facial recognition programs are particularly bad at identifying women and people of color, like Lynch. In his case, prosecutors said the use of another photo at trial for comparison purposes undercut any claims of misidentification.

Police haven't let the technology's alleged fallibility stop them from expanding their use of the software, including in places with large minority populations like Detroit, Chicago and New York — a fact many departments are keeping to themselves.

They insist the technology is an important new crime-fighting tool, no different from fingerprints or DNA, and they point to major wins, such as its role in identifying the Boston Marathon bombers and a man who caused a bomb scare in New York City.

But privacy and social justice advocates are fighting to limit or ban police use of the technology because of bias and transparency concerns. Those efforts have succeeded in cities like Somerville, Massachusetts, and San Francisco, and across the state of California.

Now, Michigan, New York and other states, perhaps even Congress, may follow suit.

"We have this sort of inherent bias toward technology as infallible, and we know that this technology is far from infallible," Trivedi said.

'We Don't Even Know About It'

About a quarter of law enforcement agencies in the U.S. have access to a facial recognition system, and over half of American adults are in a facial recognition database, according to Jameson Spivack, policy associate at the Center on Privacy & Technology at Georgetown University Law Center.

The FBI, meanwhile, has performed 390,186 facial recognition searches and, between 2017 and April 2019, received 152,565 requests from law enforcement for searches, according to June testimony before the House Oversight and Reform Committee from FBI Deputy Assistant Director Kimberly J. Del Greco.

Such public testimony is rare. Many law enforcement agencies have been less than forthcoming about their use of the technology, activists say, and it is difficult to pin down which police departments are employing it or how.

Both the New York and Los Angeles police departments, for instance, denied records requests about their facial recognition systems from Georgetown Law's Center on Privacy & Technology despite both departments having previously acknowledged their use, according to a report from the center.

And the Chicago Police Department said in 2016 that it doesn't use the technology in real-time situations, according to that same report, even though facial recognition company DataWorks Plus said in 2013 that Chicago had purchased a facial recognition system that "uses the Chicago Police Department[']s mugshot database comprised of over 3 million records with a real time database update feed."

In Jacksonville, Lynch wasn't told he had been identified using facial recognition until eight days before his final pretrial conference. When contacted, the Jacksonville Sheriff's Office said only that it "has yet to purchase any facial recognition software and it has not been decided if we will at this time," despite the fact that prosecutors, the court, Lynch's lawyer and the ACLU all agreed it used the technology to identify Lynch.

"Law enforcement agencies are notoriously secretive about their acquisition and use of new surveillance technologies," said Evan Greer, deputy director of Fight for the Future, a nonprofit working to ban government use of facial recognition. "It's very likely that there are law enforcement agencies in the U.S. that are currently using facial recognition technology and we don't even know about it."

Until recently, that was true in Detroit.

The Detroit Police Department has been using facial recognition for two years, but most Detroiters only found that out in May when Georgetown Law's Center on Privacy & Technology published a study of the program.

Detroit purchased its system in July 2017 from DataWorks Plus, just one of many companies angling for a piece of the booming facial recognition market, a field that also includes Google, Apple, Facebook, Amazon and Microsoft.

Police use facial recognition systems like Detroit's in two ways, according to Andrew Ferguson, author of "The Rise of Big Data Policing." The first is face identification, in which the software compares a face, usually from video footage, to a database of faces to identify a suspect after a crime, something Ferguson said "is relatively common now, both on federal and state levels."

The second, face surveillance, involves identifying faces continuously, whether in live crowds or in stored video. For the most part, police aren't doing that in the U.S. — yet, according to Ferguson. But "the technical capacity exists," and in places like China, it's already a reality, he said.
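
The difference between the two modes is structural rather than technical: the same one-to-many search is either run once against a single recovered image, or run continuously over every frame of a feed. A minimal sketch of that distinction, with detect_faces() and search_gallery() as hypothetical stand-ins rather than any vendor's actual API:

    def detect_faces(image):
        """Placeholder detector: would return one embedding per face found."""
        return []

    def search_gallery(face, gallery):
        """Placeholder one-to-many search returning ranked candidate matches."""
        return []

    def face_identification(crime_scene_photo, mugshot_gallery):
        # Mode 1: one image from a past crime, searched once after the fact.
        return [search_gallery(face, mugshot_gallery)
                for face in detect_faces(crime_scene_photo)]

    def face_surveillance(video_frames, watchlist_gallery):
        # Mode 2: the same search, run continuously on live or stored video,
        # identifying everyone who passes in front of a camera.
        for frame in video_frames:
            for face in detect_faces(frame):
                yield search_gallery(face, watchlist_gallery)

In code terms, face surveillance is face identification inside a loop, which is why critics argue that the capability, once purchased, slides easily from one mode to the other.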

The secrecy around Detroit's program means citizens have "not a clue" about how DPD is using it, said Eric C. Williams, senior staff attorney at the Detroit Justice Center, and until recently, there were no regulations governing the police's use of the technology.

But proponents of the technology say using facial recognition to solve crimes is no different from using fingerprints or DNA, and they emphasize that no one is arrested based solely on facial recognition.

"Whether you're looking at a fingerprint database to find the fingerprint left at the countertop of the bank or looking through a mugshot photo repository to find the subject of a face that was left from a surveillance image, the outcome is the same," said Todd Pastorini, executive vice president and general manager of DataWorks Plus, the company contracting with Detroit police.

Police are still "doing all their detective work and everything else involved that they would normally do in a case to put that subject at the scene of the crime," he said.

In the wake of community backlash to the Georgetown report, Detroit's Board of Police Commissioners in September approved guidelines prohibiting facial recognition from being the sole basis for an arrest and preventing police from using it on mobile or live-streaming video, Williams said. But advocates have concerns over how the guidelines will be enforced.

"Having the police say, 'We're not going to use it for this, we pinky-swear,' is troubling," Williams said.

Technology Can Be Biased, Too

That lack of transparency masks what activists say is a bigger problem with facial recognition: its inaccuracy, specifically with darker and female faces.

Researchers at the MIT Media Lab published a study in 2018 that tested facial recognition programs from Microsoft, IBM and Chinese company Face++. While the programs made mistakes less than 1% of the time when identifying light-skinned males, Microsoft's error rate when tasked with identifying darker-skinned women was 21%. The error rates for IBM and Face++ were nearly 35%.
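
The study's headline numbers come from disaggregated evaluation: rather than reporting a single overall accuracy figure, the researchers computed error rates separately for each demographic subgroup, which is what exposes the gap. A minimal sketch of that bookkeeping, using toy records rather than the study's data:

    from collections import defaultdict

    def error_rates_by_group(records):
        """records: iterable of (group, predicted_label, true_label) tuples."""
        wrong, total = defaultdict(int), defaultdict(int)
        for group, predicted, actual in records:
            total[group] += 1
            wrong[group] += predicted != actual
        return {group: wrong[group] / total[group] for group in total}

    toy = [
        ("lighter-skinned male", "male", "male"),
        ("lighter-skinned male", "male", "male"),
        ("darker-skinned female", "male", "female"),  # the failure mode the study measured
        ("darker-skinned female", "female", "female"),
    ]
    print(error_rates_by_group(toy))  # {'lighter-skinned male': 0.0, 'darker-skinned female': 0.5}

An aggregate accuracy number can look impressive while masking a double-digit error rate on a single subgroup, which is exactly the pattern these tests found.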

The ACLU made this point more vividly when it tested Amazon's facial recognition program on members of Congress. The software incorrectly flagged 28 congressmen and women as criminals, according to the ACLU, which said the false matches included six members of the Congressional Black Caucus. The program did just as poorly when the organization tested it last month on New England athletes from the Bruins, Celtics, Red Sox and Patriots, falsely identifying 27 of them as criminals; 13 of the misidentified athletes are people of color.

"Because many of the early datasets were tested and created around white males, they tend to do an okay job of identifying white men," Ferguson said. "In the context of identifying women, or women of color, or nonbinary people, they do a pretty poor job."

Because black and Latinx people are more heavily policed, they are more likely to end up in the mugshot databases searched using facial recognition, Spivack said, "meaning the people it's used the most on are those it works the worst on."

This bias has led to several recent false identifications, activists say.

In April, Brown University student Amara Majeed, who is from Sri Lanka, was mistakenly picked out by facial recognition software as a suspect in a terrorist attack in that country, leading to her receiving death threats.

That same month, New Yorker Ousmane Bah, who is African American, sued Apple for $1 billion after he was arrested when, according to his complaint, Apple mistakenly linked Bah's name in its facial recognition system to the face of a man who had robbed a series of Apple stores.

"If you ever said, 'There is a 90% chance that this technology will misidentify white people, but we won't make these kind of identification mistakes on black people,' I guarantee you there's no place in the country that would implement this," Williams said.

But prosecutors in the Lynch case said the risk of misidentification had been minimized. When Lynch challenged his conviction in a Florida appellate court, the government argued in a brief that the photos taken by police during the drug buy and the one of Lynch taken at his arrest "mitigated the risk of misidentification by the detectives."

The appellate court agreed in 2018, saying in its opinion denying Lynch's appeal that "the jury convicted only after comparing the photo the officers took to Lynch himself and to confirmed photos of Lynch."

In July, the Florida Supreme Court chose not to hear Lynch's further appeal of his conviction, and Lynch is currently serving out his eight-year sentence.

Setting Limits

Facial recognition software has developed so quickly and "under the radar," according to Ferguson, that there are few laws or guidelines governing its use. Recently, however, a handful of cities and states have set limits or bans on police use of the technology.

San Francisco became the first city to prohibit government use of facial recognition in May, with Somerville, Massachusetts, following suit in June, Oakland joining them in July and Berkeley getting on board in October.

The bans in San Francisco, Oakland and Berkeley apply to all city departments, including police, according to Brian Hofer, chair and executive director of Secure Justice, who was instrumental in getting those bans passed.

"I think it's the most intrusive technology developed in our lifetime. It's bigger than wiretaps, it's way more invasive than license plate readers," Hofer said. "I can evade a license plate reader by taking public transit or walking or car-share or something, but I'm stuck with this face."

Hofer is also part of a coalition that successfully worked to ban facial recognition use with police-worn body cameras throughout California for three years, a law signed by California's governor in October.

Bans are also being discussed in Emeryville, California; Cambridge, Massachusetts; and Portland, Oregon, according to Spivack. The New York City Council recently expressed renewed interest in a bill, first introduced in 2018, that would regulate police use of the technology after it came to light in August that the NYPD was using it on children and teens. And in May, the House Committee on Oversight and Reform began holding hearings on the possibility of regulating the federal government's use of the technology.

"I think we are beginning to see departments and cities come to the realization that if they're going to use this technology, they need to regulate it and they need to have citizen input in that regulation," Ferguson said.

Law enforcement, though, insists those bans are wrong-headed.

"Any time we have a technology that's going to help us be more professional and have a better ability to perform our job, we want to make sure that we embrace that," said Ronald Lawrence, president of the California Police Chiefs Association, which lobbied against California's body camera bill, whittling it down from a total ban to a three-year moratorium.

The investigation of the Boston Marathon bombing is "a perfect example" of how facial recognition can be a boon to both police and the public, said Lawrence, who pointed out that the technology "was critical to solving that case."

Advocates for the technology point to other successes as well, including the NYPD's use of facial recognition to identify the man who caused chaos when he left a suspicious pair of rice cookers in a New York City subway station in August.

"I've personally seen hundreds of cases by our customers that they've shown me from surveillance from banks, from 7-Elevens, from smash-and-grabs, murder ... that they've been able to use facial recognition to track down the right individual," said DataWorks Plus' Pastorini.

Despite these successes, activists continue to fight law enforcement's use of the technology. Hofer says he is currently working with four other cities on banning facial recognition software.

"The infrastructure is already in place — there's cameras everywhere — the expensive part's done," he said. "They just need to apply software, which is getting cheaper and cheaper by the day. And if we allow it in those narrow instances, it's going to be everywhere. We have to keep the genie in the bottle."

Have a story idea for Access to Justice? Reach us at accesstojustice@law360.com.

--Editing by Katherine Rautenberg.
