Attys Suspect AI In Police Surveillance Could Lead To Bias

By P.J. D'Annunzio | October 14, 2025, 3:44 PM EDT

A panel of Pennsylvania attorneys speaking on advances in the use of artificial intelligence in criminal justice and surveillance expressed concern over the potential misuse of such technologies, predicting they could result in rights violations on both individual and mass scales.

The discussion took place Friday at the Philadelphia Bench-Bar & Annual Conference at the Borgata Hotel Casino & Spa in Atlantic City. Topics ranged from police body cameras and consumer security devices to speculation about whether a "Minority Report"-style system, a reference to the Philip K. Dick short story and later Steven Spielberg film in which crimes are foreseen before they are committed, could be implemented in the U.S.

The panelists acknowledged that AI has utility for time-saving tasks like summarizing deposition testimony, but they also noted that unfettered use by prosecutors and police, without analyzing the underlying programming for bias, can lead to serious consequences.

"I'm scared as hell because there are major issues," said panelist and criminal defense lawyer Troy H. Wilson of Wilson Law Offices in Philadelphia.

"What I learned early on dealing with these issues is it's not about the computer, it's about the physical person developing the algorithm, it's about the physical person putting his or her biases in the program," Wilson continued.

Chad Marlow, senior policy counsel at the American Civil Liberties Union, said tools like facial and gait recognition can be affected by biased programming, often displaying reduced accuracy in identifying Black, female and elderly people.

Marlow also noted that AI used to summarize police reports and audio captured by body cameras is programmed to find evidence of guilt and may not look for exculpatory evidence, increasing the risk of prosecuting innocent people.

He said the risk increases when law enforcement uses such tools to identify potential suspects in crimes that have not yet been committed, and that the technology is especially dangerous in the hands of authoritarian governments. Marlow added that such a "Minority Report" system already exists in China and that the idea of predicting crimes rests on what amounts to analytical stereotypes of certain populations.

"Predictive policing follows in the garbage in, garbage out data analysis," Marlow said.

Additionally, Marlow said that in all aspects of life, people can exhibit "automation bias," or the belief that an AI-generated analysis is superior to human reasoning simply because it is the product of advanced technology.

"People are just deferring to computers," he said.

Recordings from police body cameras of interactions between officers and community members, whether or not the incidents they depict result in criminal charges, could be used to misinform AI about what criminal activity looks like, said Catherine Twigg, general counsel of the city of Philadelphia's Citizens Police Oversight Commission.

"We're creating a trove of data that might look like it tells you what crime is like in Philadelphia, but it really tells you what policing is like in Philadelphia," Twigg said.

Twigg also expressed concern about what she saw as an overall lack of oversight in the tech industry, with AI being developed faster than the government can regulate it. She also pointed to consumer companies like Ring partnering with law enforcement agencies to allow access to doorbell cameras, turning unsuspecting residents' security devices into part of a mass surveillance apparatus.

--Editing by Lakshna Mehta.