AI is Putting Innocent People at Risk of Being Incarcerated


by Alyxaundria Sanford, Innocence Project

Robert Williams thought the call his wife received, saying he needed to turn himself in to the police, was a prank. But when the Michigan resident pulled into his driveway, the Detroit police officer who had been waiting outside Williams’ home pulled up behind him, got out of the car, and placed him under arrest. He was detained for 30 hours.

Williams’ encounter with the police that day in January 2020 was the first documented case of wrongful arrest due to the use of facial recognition technology (FRT). He was accused of stealing thousands of dollars’ worth of Shinola watches. Grainy surveillance footage provided to law enforcement was run through facial recognition software and matched to an expired driver’s license photo of Williams.

There are at least seven confirmed cases of misidentification due to the use of FRT, six of which involve Black people who have been wrongfully accused: Nijeer Parks, Porcha Woodruff, Michael Oliver, Randall Reid, Alonzo Sawyer, and Robert Williams.

There has been concern that FRT and other artificial intelligence (AI) technologies will exacerbate racial inequities in policing and the criminal legal system. Research shows that facial recognition software is significantly less reliable for people of color, especially Black and Asian people, because the algorithms struggle to distinguish the facial features of people with darker skin tones. Another study concluded that disproportionate arrests of Black people by law enforcement agencies using FRT may be the result of “the lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible and a tendency of officers’ own biases to magnify these issues.”
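The disparity the research describes is measurable. Audits such as NIST’s demographic evaluations compare how often a matcher wrongly scores two different people as the same person, broken down by demographic group. The following sketch is purely illustrative and is not drawn from the Innocence Project or any system named in this article; the embedding-based matcher, the cosine-similarity scoring, the 0.8 threshold, and the data layout are all assumptions made for the example.

    import math
    from collections import defaultdict
    from itertools import combinations

    def cosine_similarity(a, b):
        # Similarity score between two face-embedding vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    def false_match_rates(embeddings, threshold=0.8):
        # Per-group false match rate: the share of different-person pairs
        # within each demographic group that the matcher scores as a "match".
        # `embeddings` is a list of (person_id, group, vector) tuples; the
        # threshold is a hypothetical operating point, not any real system's.
        errors = defaultdict(int)   # false matches counted per group
        trials = defaultdict(int)   # different-person comparisons per group
        for (id_a, grp_a, vec_a), (id_b, grp_b, vec_b) in combinations(embeddings, 2):
            if grp_a != grp_b or id_a == id_b:
                continue  # compare only different people within the same group
            trials[grp_a] += 1
            if cosine_similarity(vec_a, vec_b) >= threshold:
                errors[grp_a] += 1  # two different people wrongly "matched"
        return {group: errors[group] / trials[group] for group in trials}

On a balanced evaluation set, output like {'group A': 0.001, 'group B': 0.008} would mean members of group B face eight times the risk of being wrongly matched to a stranger, which is the kind of disparity behind the arrests described above.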

What is particularly worrying is that the adoption and use of AI, such as FRT, by law enforcement echoes previous examples of the misapplication of forensic science, including bite mark analysis, hair comparisons, and arson investigation, that have led to numerous wrongful convictions.

“The technology that was just supposed to be for investigation is now being proffered at trial as direct evidence of guilt. Often without ever having been subject to any kind of scrutiny,” said Chris Fabricant, the Innocence Project’s director of strategic litigation and author of Junk Science and the American Criminal Justice System.

“Corporations are making claims about the abilities of these techniques that are only supported by self-funded literature,” said Fabricant. “Politicians and law enforcement that spend [a lot of money] acquiring them, then are encouraged to tout their efficacy and the use of this technology.” 

For decades, DNA has been essential in proving the innocence of people wrongfully convicted as a result of faulty forensic methods. Indeed, half of all DNA exonerations were the result of false or misleading forensic evidence. And of the 375 DNA exonerations in the United States between 1989 and 2020, 60% of the people freed were Black, according to the National Registry of Exonerations.

Still, not all cases lend themselves to DNA exonerations, especially those that involve police use of AI. For this reason, the Innocence Project is proactively pursuing pretrial litigation and policy advocacy to prevent the use of unreliable AI technology, and the misuse of even potentially reliable AI technology, before the damage is done.

“Many of these cases … are just as susceptible to the same risks and factors that we’ve seen produce wrongful convictions in the past,” said Mitha Nandagopalan, a staff attorney in the Innocence Project’s strategic litigation department.

Nandagopalan is leading the Innocence Project’s latest endeavor to counter the potentially damaging effects of AI in policing, particularly in communities of color. 

“What is often seen in poorer neighborhoods or primarily [communities of color] is surveillance that is imposed by state and municipal actors, sometimes in line with the wishes of that community and sometimes not. It’s the people who live there that are the target in their own neighborhoods,” said Nandagopalan. “In wealthier neighborhoods, whiter neighborhoods, I think you often see surveillance that is being bought and invited by residents.” Such resident-driven surveillance includes Ring doorbell cameras and homeowners’ associations contracting with license plate reader companies such as Flock Safety.

The Neighborhood Project is a multidisciplinary effort to understand how surveillance technologies, like FRT, impact a community and may contribute to wrongful convictions. The project will focus on a particular location and partner with community members and local organizations to challenge and prevent the use of unreliable and untested technologies.

“Ultimately, what we want is for the people who would be most impacted by the surveillance technology to have a say in whether, and how, it’s getting used in their communities,” said Nandagopalan.

Last year, the Biden administration issued an executive order to set standards for and manage the risks of AI, including the development of “tools, and tests to help ensure that AI systems are safe, secure, and trustworthy.” However, no federal policies are currently in place to regulate the use of AI in policing.

In the meantime, there are ways for concerned community members to influence and encourage local leaders to regulate the use of these technologies by local law enforcement and other agencies. 

“These are great reasons to go to your local city council or town council meetings. That’s where these [tech presentations] happen. On the very local level, those representatives are the people who are voting whether to use tax dollars or public money to fund this stuff,” said Amanda Wallwin, one of the Innocence Project’s state policy advocates.

“If you can be there, if you’re in the room, you can make such a difference.”

Republished courtesy of The Innocence Project


 
