U.S. Police Are Getting a Crime Prediction Map Straight Out of 'Minority Report'

Source: AP

The movie-turned-TV-show Minority Report imagines a future in which algorithms and data help law enforcement pick targets before they've ever committed a crime. Today, that possibility isn't so futuristic — it's right around the corner.

Hitachi, a company famous for making air conditioners, power tools and "personal massagers," has designed a system called Hitachi Visualization Predictive Crime Analytics. Hitachi will roll out the system to half a dozen U.S. cities by the end of October, reports Fast Company.

Hitachi hasn't confirmed which cities will receive the trial software, but Washington, D.C., appears in screenshots of the program, and New York has shown interest in predictive technology in the past.

Source: Hitachi

Hitachi's system builds on the same crime-prediction inputs police have always used — like prior experience and local histories of reported crimes — and augments them with data culled from social media platforms like Twitter, which early studies in Chicago suggest can improve crime prediction.

"A human just can't handle when you get to the tens or hundreds of variables that could impact crime," Hitachi CTO Darrin Lipscomb told Fast Company, "like weather, social media, proximity to schools, Metro stations, gunshot sensors, 911 calls."
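To make that concrete, here is a toy sketch of the general idea Lipscomb describes: blending many normalized signals into a single risk score for one patch of a city map. This is purely illustrative — the feature names, weights, and scoring method below are hypothetical, not Hitachi's actual model.

```python
# Toy illustration (NOT Hitachi's actual algorithm): combine many
# normalized signals into one crime-risk score for a single map cell.
# Every feature name and weight here is a made-up assumption.

def risk_score(features, weights):
    """Weighted sum of signals (each scaled to 0-1) for one grid cell."""
    return sum(weights[name] * value for name, value in features.items())

# Hypothetical signals for one city block, each scaled to 0-1.
cell = {
    "recent_911_calls": 0.8,      # call volume relative to citywide max
    "gunshot_sensor_hits": 0.3,   # sensor activations, normalized
    "social_media_mentions": 0.5, # relevant posts, normalized
    "near_metro_station": 1.0,    # binary proximity flag
    "bad_weather": 0.1,           # weather severity, normalized
}

# Hypothetical importance weights (sum to 1.0).
weights = {
    "recent_911_calls": 0.35,
    "gunshot_sensor_hits": 0.30,
    "social_media_mentions": 0.15,
    "near_metro_station": 0.10,
    "bad_weather": 0.10,
}

print(round(risk_score(cell, weights), 3))  # → 0.555
```

A real system would learn the weights from historical data rather than hand-picking them — which is exactly where the profiling concern below comes in, since the historical data itself reflects past policing patterns.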

Source: Hitachi

The problem with pre-crime prediction models: they invite increased profiling. A system designed to identify at-risk neighborhoods and populations by the criteria police already use could simply amplify the biases and discrimination that lead to overpolicing in the first place.

Hitachi execs argue that their system would let police allocate resources to where crime is actually about to happen, rather than double down on overly profiled areas. But as we've seen with machine-learning algorithms like Facebook's News Feed or Spotify's "Discover" playlists, predictive models tend to reinforce the patterns they have already learned.

In this case, that could mean cold, algorithmic justification for overpolicing neighborhoods and populations that have enough law enforcement as it is.