Racist prediction won’t stop crime

Law enforcement agencies around the country are using predictive policing software that is built on the controversial, unproven "broken windows" theory and encourages racist practices.

PredPol is one such software package that uses a machine-learning algorithm to calculate predictions based on three data points – crime type, crime location and crime date/time.

Critics note that the PredPol software essentially aims police at locations where crimes happened in the past, which reinforces racial discrimination and stereotypes about the poor.

“For us and our customers, it is the practice of identifying the times and locations where specific crimes are most likely to occur, then patrolling those areas to prevent those crimes from occurring,” says the company’s website. “Put simply, our mission is to help law enforcement keep communities safer by reducing victimization.”

However, civil rights activists argue that the software increases victimization of populations that are routinely subjected to maltreatment in society.

PredPol does not collect, upload, analyze or otherwise involve any information about individuals or populations and their characteristics – but the software still raises serious privacy and profiling concerns.

Only reported crimes are counted, so communities that fear police intervention are less likely to be represented in the data, and the factors that correlate with recorded crime typically correlate with race as well.

It also makes vengeance, rather than justice, the primary motivation for law enforcement.

There are vast limitations to AI technology, but the disadvantages inherent in predictive policing software start with the fallibility of human nature.

Kristian Lum, who co-wrote a 2016 paper that tested the algorithmic mechanisms of PredPol with real crime data, said that although the software is powered by complicated-looking mathematical formulas, its actual function can be summarized as a moving average, an average computed over successive subsets of a data set.
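To make Lum's point concrete, here is a minimal sketch, in Python, of the kind of moving-average hot-spot scoring she describes. The grid cells, decay weights, window sizes and records are all illustrative assumptions for this article, not PredPol's actual parameters or data.

```python
from collections import defaultdict
from datetime import datetime

# Each record mirrors the three inputs the article names: crime type,
# crime location (reduced here to a map grid cell) and date/time.
# The records themselves are invented for illustration.
reports = [
    {"type": "burglary", "cell": (4, 7), "time": datetime(2020, 6, 1, 22, 0)},
    {"type": "burglary", "cell": (4, 7), "time": datetime(2020, 6, 8, 23, 0)},
    {"type": "assault",  "cell": (2, 3), "time": datetime(2020, 5, 15, 1, 0)},
]

def hotspot_scores(reports, now, window_days=180, half_life_days=30):
    """Score each cell with an exponentially weighted moving average of
    past reports: recent reports count more and older ones decay away,
    which is the 'average of subsets within a data set' Lum describes."""
    scores = defaultdict(float)
    for report in reports:
        age_days = (now - report["time"]).days
        if 0 <= age_days <= window_days:
            # Half the weight is lost every half_life_days.
            scores[report["cell"]] += 0.5 ** (age_days / half_life_days)
    return scores

now = datetime(2020, 6, 15)
ranked = sorted(hotspot_scores(reports, now).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked)  # cells with the most, and most recent, reports rank first
```

Note what the sketch makes plain: the score is driven entirely by where past reports came from, so a neighborhood that is heavily patrolled today generates more reports and ranks higher tomorrow – exactly the feedback loop the critics describe.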

A group of mathematicians is calling on their colleagues to refuse to work with law enforcement officials.

“In light of the extrajudicial murders by police of George Floyd, Breonna Taylor, Tony McDade and numerous others before them, and the subsequent brutality of the police response to protests, we call on the mathematics community to boycott working with police departments,” said a letter signed by academic mathematicians.

“There are also deep concerns about the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression,” it also said. “Given the structural racism and brutality in US policing, we do not believe that mathematicians should be collaborating with police departments in this manner.”

The rich and powerful often exploit people in far more sinister and pervasive ways than common criminals do, but the law allows actions that are legitimized by institutional rules.

Petty theft costs far less than the trillions of dollars diverted from working-class income to profit for stock owners over the past 40 years, but greed is only criminal when it's committed by less affluent individuals.

Computer-aided analysis only focuses attention on what society has already defined as crime, instead of accurately identifying what is genuinely wrong.

Since our economy and political establishment are clearly broken, predictive policing software can only reinforce injustice rather than correct corruption.

