Predictive Policing: Fewer Crimes or Data-Driven Racial Profiling?
In Philip K. Dick’s short story “The Minority Report” and the film it inspired, a system called Precrime arrests people before they have the opportunity to commit a specific felony. Today, that science fiction may be closer to reality: police departments in some communities are tapping Cloud-based software to predict where and when a crime is likely to occur.
“To us, ideally, you stop crime before it happens,” said Brian MacDonald, CEO of PredPol, a predictive policing system used by more than 60 police departments across the country. “There’s no victim, and it’s a win-win for everyone.”
PredPol, like other predictive policing systems, runs in the Cloud: a remote data center that police officers can reach from any web-enabled device. MacDonald calls it “data-driven, or intelligence-led, policing.” Using statistical crime data and a machine-learning algorithm that is retrained every six months, the system predicts where, when and what type of crime is likely to happen.
Here’s how it works: A police officer working a beat might have downtime after a call and can consult the system in the Cloud, which predicts an area of the community where a crime is statistically more likely to occur at that time of day or night. The officer can then patrol that area.
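PredPol says its model uses only the type, place and time of past crimes; the algorithm itself is proprietary. As a rough illustration of the general grid-based hotspot approach such systems take, here is a minimal sketch in which the cell size, decay rate and scoring rule are all assumptions, not the vendor’s method:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative sketch of a grid-based hotspot predictor. PredPol's actual
# model is proprietary; the constants and scoring rule below are assumptions.
CELL_DEG = 0.005          # ~500 m grid cells (assumption)
HALF_LIFE_DAYS = 30.0     # recency half-life for past incidents (assumption)

def cell_of(lat, lon):
    """Snap a coordinate to its grid cell."""
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

def hotspot_scores(incidents, now, hour):
    """Score each grid cell by recency-weighted counts of past incidents
    that occurred during the same hour of day."""
    scores = defaultdict(float)
    for lat, lon, when in incidents:
        if when.hour != hour:
            continue
        age_days = (now - when).total_seconds() / 86400.0
        scores[cell_of(lat, lon)] += 0.5 ** (age_days / HALF_LIFE_DAYS)
    return scores

def top_cells(incidents, now, hour, k=3):
    """Return the k highest-scoring cells for the given hour."""
    scores = hotspot_scores(incidents, now, hour)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Example: recent incidents near one block outscore an older, distant one.
now = datetime(2016, 10, 1, 22, 0)
incidents = [
    (34.0522, -118.2437, now - timedelta(days=2)),
    (34.0525, -118.2440, now - timedelta(days=10)),
    (34.0900, -118.3600, now - timedelta(days=90)),
]
print(top_cells(incidents, now, hour=22))
```

Whatever the real model looks like, the essential point holds: the predictions are a function of nothing but the incident log the system is fed.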
“We are creating predictions not based on bias, race or demographics,” said MacDonald. “We let the crime data speak for itself — this is where objectivity comes from. We try to allow police departments to sidestep questions of bias.”
Some civil rights organizations disagree. Upturn, a team of technologists and policy analysts in Washington, D.C., partners with civil rights groups to guide them in the world of digital technology. In August, it released a report on predictive policing systems and the potential impact on civil rights, along with a statement of concerns signed by 17 organizations, including the American Civil Liberties Union and The Leadership Conference on Civil and Human Rights.
According to the Upturn report, because predictive policing systems rely on crime data supplied by the police departments themselves, the Cloud can end up simply mirroring back the data the police already have.
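That mirroring can become a feedback loop: predictions send patrols to wherever crime was recorded, and patrols generate more recorded crime there. A toy simulation makes the dynamic concrete; the starting counts, rates and allocation rule are invented for illustration and describe no real deployment:

```python
import random

# Toy simulation of the feedback loop the Upturn report warns about.
# Both areas have the SAME underlying crime rate; only the historical
# record differs. All numbers here are invented for illustration.
random.seed(0)
TRUE_RATE = 0.3            # identical true crime rate in areas A and B
DETECT_PER_PATROL = 0.5    # chance a patrol records a crime that occurs

recorded = {"A": 12, "B": 8}   # slightly uneven historical data to start

for week in range(20):
    total = recorded["A"] + recorded["B"]
    # Patrols allocated in proportion to recorded crime...
    patrols = {area: round(10 * recorded[area] / total) for area in ("A", "B")}
    for area in ("A", "B"):
        # ...and more patrols mean more of the same true crime gets recorded.
        for _ in range(patrols[area]):
            if random.random() < TRUE_RATE * DETECT_PER_PATROL:
                recorded[area] += 1

print(recorded)  # area A's head start in the record tends to compound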
“The danger that civil rights groups have recognized is that if we pretend that the statistics are objective, the result is biased patterns,” said Upturn Principal and Co-Founder David Robinson.
The systems, Robinson continued, are optimized for reductions in reported crimes. But the Cloud doesn’t take into account “the cost impact to the community” if there is an increase in demeaning stops and searches.
“It’s not measuring community trust. The computer doesn’t care. With that kind of blindness, these systems can do unnoticed harm,” Robinson said.
In fact, he argued, the very qualities that give the Cloud its magic, access to this data anywhere at very little cost, also make predictive policing especially challenging from a civil rights perspective. Police departments, he said, are folding the predictive systems into their budgets with little public comment.
“The community needs a voice in figuring out what they do and don’t want,” said Robinson. “Being democratic about it calls for public discussion and debate. … Democracy means that it shouldn’t be a secret.”
Predictive policing advocates counter that they are effectively disrupting opportunities for crime.
“We won’t prevent every crime,” said MacDonald, “but we will reduce opportunity and decrease risk enough that the overall crime rate drops. We are not about catching criminals but preventing crimes. Our goal is to make communities safer.”