
Adding an algorithm: Mayor Paul Young, MPD plan for AI camera 'blanket' in efforts to curb crime

In places that used similar technology, such as New Orleans, people’s perceptions of crime did not necessarily change, researchers say.

MEMPHIS, Tenn. — Memphis Mayor Paul Young said he has a new addition to his plan to curb crime: a possible $15 million camera network loaded with artificial intelligence.

Young spoke at a town hall event in Hickory Hill on Oct. 22, and shared a preview of his administration’s plan.

“We’re making significant progress,” Young said. “In the next couple of weeks, we’ll be able to announce some things, but I will say that our plan has been to blanket our community with high-definition, 4K cameras where we have the artificial intelligence built in so you can identify negative activities.

“So if someone is breaking into a vehicle, or we’re looking for a certain type of vehicle that doesn’t have a license plate with a dent on the hood, these cameras will be able to pick them up.”

Young said those cameras would be able to immediately connect to crime centers, and the technology would complement law enforcement actions.

“Our goal is to get criminals off the street quickly,” Young said. “We want the people that are terrorizing our neighborhoods and communities off the streets. We’re going to use this technology to make sure we’re doing it.”

Young said the cameras will be installed in stages, with the current phase placing 200 cameras across the city. The first phase is set to cost roughly $3 million, but Young said he expects the total cost to reach approximately $10 million to $15 million.

The use of AI in policing has been a topic of debate for years, particularly around its effectiveness and, especially, facial recognition technology, something that is not planned for rollout in Memphis.

THE FLAWED ALGORITHM AND THE FLAWED HUMAN BEING

Richard Berk, emeritus professor of criminology at the University of Pennsylvania, said AI is used in policing in a variety of ways and at different stages. But its usefulness depends on the quality of the work, as well as on corroborating evidence.

“There’s not one way to know what’s being used and how,” Berk said. “I know that here in Pennsylvania that we're using software increasingly to analyze videos from body worn cameras, and some of that software has facial recognition in it.

“I'm on a committee for the legislature of Pennsylvania, asking this very question, like, what kind of software is being used to analyze videos for body worn cameras, and is facial recognition involved? And we don't have an answer to that yet, and that's just for Pennsylvania.”

But Berk described the use of the technology in policing as being as revolutionary as the introduction of the patrol car, if not more so.

“Remember the baseline for all AI, be it police or elsewhere, is current practices by humans on the same task, for example, in the identification of cancer tissues and lung X-rays,” Berk said. “Can an algorithm be more accurate than a radiologist? And the answer often is yes. Not that the algorithm [won’t] make some mistakes, but they make fewer mistakes than the radiologist. The goal is improvement, not perfection.”

Berk used the example of a break-in near his home, where a closed-circuit camera captured an image of a man crossing the street and coming into a driveway.

“The question is whether an algorithm would do better with that same image than a human looking at that closed circuit TV, and it wasn't tried in this case, because there wasn't any algorithm in play,” Berk said. “This was several years ago, but currently that's an open question, and I think in many cases, the flawed algorithm will do better than the flawed human being.”

Berk said it is well known that humans make mistakes all the time, and that accuracy would still be a question even without AI, noting stories of people wrongfully convicted of crimes they didn’t commit because of faulty identifications.

“The benchmark of human practice is not necessarily a very high bar,” Berk said.

Nearly 27 percent of exonerations are due to mistaken witness identifications and 29 percent are due to false or misleading forensic evidence, according to The National Registry of Exonerations.

Those accuracy issues will always exist, Berk said, but oversight procedures would help when adding AI technology into policing.

“There's big literature on how you set up a lineup,” Berk said. “How you set that up is really, really important, because if it's not set up properly, there's all kinds of leading cues, even if they're inadvertent, that leave the witness to make a false identification. [That] needs to be developed for these algorithms.”

Berk added that related issues such as privacy are important, and that what privacy means when these types of cameras are in place is ultimately a legal question. For the technology itself, he said, the central question is accuracy.

“Are we improving our ability to identify suspects, even though it's not perfect?” Berk said. “The answer is increasingly yes.”

IMPACT ON ARRESTS AND RACIAL DISPARITIES

Articles on both the Innocence Project and American Civil Liberties Union websites have highlighted false arrests made when AI technology was used without further corroborating evidence.

Georgia State University criminologist Thaddeus Johnson is a former Memphis Police Department officer. His recent research has focused on the use of AI in policing, and while it found benefits from the technology, it also found issues with its application.

“We’re not sure what type of camera systems they’re using, we’re not sure how these agencies are storing these images and these data, we’re also not sure how police departments are using them,” Johnson said. “And so a big problem is these places are not transparent in how they use it. If it’s not transparent in how you use it, you also weaken the deterrence effect."

“If you’re going to use this technology, what this study says, it can have great impacts and not be overly biased or lead to over policing, however it needs to be signed off on. You need to have certain policies in place.”

Johnson’s research found that the use of facial recognition technology contributed to reductions in violent crime, homicides and homicide rates, and that the drops in homicide rates likely came from arrests that enhanced deterrence.

The research also found that the application of facial recognition technology did not increase violent arrest rates or racial disparities in those rates.

Johnson previously wrote an opinion piece for Scientific American on how AI facial recognition technologies struggle to identify Black faces.

That article was based on prior research by Johnson showing that facial recognition technologies contribute to greater racial disparities in arrests and that oversight and policy are needed if they are to be used. As previously stated, City of Memphis officials are not planning to implement facial recognition technology.

“The technology since 2016 has gotten much, much better,” Johnson said. “However, it’s gotten better in clinical laboratory settings and so when you don’t have perfect lighting, where you don’t have perfect camera angles or people may be wearing facial masks, it is still performing worse, even though it’s better than in the past.”

Johnson said that without someone who really understands the technology, or without certain oversight procedures in place, its use could lead to further over-policing and disproportionate contact with police.

“If you have police departments that are promoting traffic stops, a police department that is trying to have a mentality of running and gunning and making arrests and rewarding officers for that, then these technologies can codify already existing disparities,” Johnson said. “Whether it's good or great or not, that has a role. It's more about the operational biases that's already there, and what it's not going to do, it doesn't have the ability to correct those wrongs of the past. 

“But most likely, what we'll see if you're already operating the way that rewards low level arrest, if you're operating in a way where certain citizens are being overexposed to police, well, this technology can increase those things and make them worse. It's not a magic bullet.”

Johnson said that in places that have used this technology, such as New Orleans, people’s perceptions of crime did not necessarily change, and there were disparities in whose images were requested to be run, even when it didn’t lead to an arrest.

But Johnson said that if he were involved in the decision-making process for a potential rollout of facial recognition technology, it would ultimately involve oversight and proper research.

“If you decide to implement it, and you have success stories or you have some failures, I think you need to make sure you're transparent and tell your citizens, because people are accepting of certain failings,” Johnson said. “I think it becomes scary when we're not knowing how it's being used and it's not being communicated. But I think having [a] civilian unit kind of focus on working with investigators, … and not necessarily police officers. That gives a level of balance and impartiality of sorts.

“[Mayor Young] wants to do something that can be effective and show this to people of Memphis that you know, that we're moving to this level of public safety and that they're doing something, but [he] can't allow it to be a tool that can be abused, even though many times, officers don't mean to, and so they just got to be really, really careful.”
