
Predictive Policing Software Terrible at Predicting Crimes


In her 2019 master’s thesis for the Naval Postgraduate School, Ana Lalley, police chief of Elgin, Illinois, wrote critically about her department’s experience with the software, which left officers unimpressed. “Officers routinely question the prediction method,” she wrote. “Many believe that the awareness of crime trends and patterns they have gained through training and experience help them make predictions on their own that are similar to the software’s predictions.”

Lalley added that when the department brought those concerns to Geolitica, the company warned that the software “may not be particularly effective in communities that have little crime.” Elgin, a Chicago suburb, has about double Plainfield’s population.

“I think that what this shows is just how unreliable so many of the tools sold to police departments are,” says Dillon Reisman, founder of the American Civil Liberties Union of New Jersey’s Automated Injustice Project. “We see that all over New Jersey. There are lots of companies that sell unproven, untested tools that promise to solve all of law enforcement’s needs, and, in the end, all they do is worsen the inequalities of policing and for no benefit to public safety.”

David Weisburd, a criminologist who served as a reviewer on a 2011 academic paper coauthored by two of Geolitica’s founders, recalls approving their ideas around crime modeling at the time, but warns that inaccurate predictions carry negative externalities beyond wasting officers’ time.

“Predicting crimes in places where they don’t occur is a destructive issue,” Weisburd says. “The police are a service, but they are a service with potential negative consequences. If you send the police somewhere, bad things could happen there.”

One study found that adolescent Black and Latino boys stopped by police subsequently experienced heightened levels of emotional distress, leading to increased delinquent behavior in the future. Another study found higher rates of use of force in New York City neighborhoods led to a decline in the number of calls to the city’s 311 tip line, which can be used for everything from repairing potholes to getting help understanding a property tax bill.

“To me, the entire benefit of this type of analysis is using it as a starting point to engage police commanders and, when possible, community members in larger dialog to help understand exactly what it is about these causal factors that are leading to hot spots forming,” says Northeastern University professor Eric Piza, who has been a critic of predictive policing technology.

For example, the city of Newark, New Jersey, used risk terrain modeling (RTM) to identify locations with the highest likelihood of aggravated assaults. Developed by Rutgers University researchers, RTM matches crime data with information about land use to identify trends that could be triggering crimes. The analysis in Newark, for instance, showed that many aggravated assaults were occurring in vacant lots.

The RTM then points to potential environmental solutions that come from across local governments, not just police departments. A local housing organization used that New Jersey data to prioritize lots to develop for new affordable housing that could not only increase housing stock but also reduce crime. Other community groups used the crime-risk information to convert city-owned lots to well-lighted, higher-trafficked green spaces less likely to attract crime.
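The core idea behind risk terrain modeling described above can be sketched in a few lines: tag each incident with the land-use features found nearby, then rank features by incident density relative to how common that feature is citywide. This is a simplified illustration, not the Rutgers RTM methodology; all data, feature names, and the 150-meter proximity assumption are invented for the example.

```python
# Toy sketch of the risk-terrain idea: count how often incidents occur
# near each land-use feature, then normalize by the feature's citywide
# prevalence to get a crude relative-risk ranking. Everything here is
# hypothetical illustration data, not real Newark figures.
from collections import Counter

# Each incident is pre-tagged with land-use features within ~150 m
# (the tagging itself would come from a GIS join, omitted here).
incidents = [
    {"features": {"vacant_lot", "bar"}},
    {"features": {"vacant_lot"}},
    {"features": {"bus_stop"}},
    {"features": {"vacant_lot", "bus_stop"}},
]

# Baseline exposure: how many map cells citywide contain each feature.
cells_with_feature = {"vacant_lot": 40, "bar": 120, "bus_stop": 200}

hits = Counter(f for inc in incidents for f in inc["features"])
risk = {f: hits[f] / cells_with_feature[f] for f in hits}
ranked = sorted(risk, key=risk.get, reverse=True)
print(ranked[0])  # prints "vacant_lot": highest incidents per cell
```

Normalizing by feature prevalence is the important step: bus stops see more raw incidents than vacant lots in this toy data, but vacant lots are far rarer, so their per-cell risk is higher, which is what would direct attention toward environmental fixes like the lot redevelopment described above.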
