Police departments in major American cities are trying to prevent violent crime with the help of an unlikely ally: data mining.
In recent years, law enforcement agencies have been experimenting with “predictive policing,” a tool that harnesses computer algorithms to identify individuals likely to commit crimes, The New York Times reports.
It may sound a little bit like “Minority Report,” a fictional future where psychics help police catch criminals right before they commit their crimes, but the reality is a bit more mundane.
The Times story details one way in which predictive policing is implemented in Kansas City: “call-ins.” Armed with information on recent parolees, high-crime neighbourhoods, and personal networks both off- and online — plus more anecdotal information like rumours that police hear on the street — algorithms detect individuals suspected to be influential in criminal groups.
Those individuals are then called in, as a group, to meet with police officials as well as “local and federal prosecutors, plus the police chief and the mayor.”
The officials warn that future violent offenses by the attendees, or even their associates, will be punished harshly. And they have examples to back up the threat: one man, caught with a bullet in his pocket after receiving a call-in warning, ended up with a 15-year prison sentence.
Police hope that these warnings will trickle down from the suspected leaders to the people they influence.
Predictive policework is rooted in complex mathematical models, but the basic premise is actually quite simple. A foundational paper on modelling crime compares crime to earthquakes to explain the rationale.
Just as earthquakes tend to lead to more earthquakes nearby and in the near future, gang retaliations, serial offenders, and repeated burglaries on a single location tend to create clusters of criminal offences that, with the right algorithms, police can forecast.
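The earthquake analogy comes from so-called self-exciting point-process models, in which each event temporarily raises the likelihood of further events nearby. As a rough illustration only (the parameter values below are invented for the example, not taken from any published crime model), the core idea can be sketched in a few lines:

```python
import math

def intensity(t, past_events, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity of a self-exciting point process:
    a constant baseline rate (mu) plus an exponentially decaying
    boost from each past event -- the crime-forecasting analogue
    of earthquake aftershocks. alpha scales the boost; beta
    controls how quickly it fades."""
    boost = sum(alpha * math.exp(-beta * (t - ti))
                for ti in past_events if ti < t)
    return mu + boost

# A single offence at time t=10 temporarily elevates the
# predicted rate of follow-on offences, then decays away.
events = [10.0]
print(intensity(9.9, events))   # before the event: baseline only
print(intensity(10.1, events))  # just after: elevated
print(intensity(20.0, events))  # much later: back near baseline
```

Forecasting software built on models like this flags the places and times where the estimated intensity is highest, which is the clustering effect described above.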
Previous predictive policing methods focused primarily on finding locations where crimes were likely to occur. A report from the nonprofit RAND Corporation, however, suggests that predictive policing can help foresee not only the locations of crimes but also their timing, the individuals likely to commit future offences, and even those likely to become victims.
New York City Police Commissioner William Bratton even called predictive policing “the wave of the future” and told a panel on “Data Mining the Modern City” that “the ‘Minority Report’ of 2002 is the reality of today.”
But predictive policing is new (Time magazine called it one of the best inventions of 2011) and the preliminary numbers are hard to interpret. According to The Times story, homicide rates in Kansas City have dipped slightly since the city launched its predictive No Violence Alliance (NoVA) program about five years ago. A more significant drop came in 2014, but it is difficult to say whether the change was a direct result of the program.
John S. Hollywood, one of the authors of the RAND report, told The Times that the few studies on predictive policing have shown only a 5 to 10 per cent improvement in crime forecasting accuracy over “regular policing methods.”
The RAND report claims that predictive policing lets departments with limited resources use those resources more efficiently, and, according to The Times, advocates of the programs believe that “predictive policing can help improve police-community relations by focusing on the people most likely to become involved in violent crime.”
Critics have claimed that predictive policing can easily lead to racial profiling. The fear is that over-eager modelling could produce even more false positives than before. Speaking with Fox News Latino, Hanni Fakhoury, an attorney for the Electronic Frontier Foundation, said that algorithms fed selective data will create a “self-fulfilling prophecy.”
“If the data is biased to begin with and based on human judgment,” Fakhoury said, “then the results the algorithm is going to spit out will reflect those biases.” The same Fox report notes that, of 685,000 police stops in New York in 2011, 87 per cent of the people stopped were Black or Latino. Foster Maer, a lawyer for a New York Latino advocacy group, told Fox that “because the data is racially biased, the names that come out will be racially biased. Predictive Policing will replicate and expand the program at an individual level.”
Despite the uncertain future of predictive policing, The Times reports that the technique is being employed in a number of police departments and district attorneys’ offices across the country, including Kansas City, Los Angeles, Miami, Nashville, Manhattan, and Philadelphia.