OPLIN 4Cast #584: No psychics, but we’ll get to ‘Minority Report’ with computers

Posted in 4cast and Algorithms

When I hear “machine learning,” my brain jumps straight to “artificial intelligence,” which movies like 2001 and The Terminator have led me to believe will turn against us. But while machine learning is based on theories from artificial intelligence, it has more in common with computational statistics and data mining: it analyzes data to make predictions, and it discovers and refines the algorithms that lead to better predictions. I’ve written about growing concerns with algorithms before, but this week a couple of news stories about “predictive policing” provide concrete examples of how this tool can be used (violent crime rates are dropping in Chicago) and disturbingly misused (punishing people for crimes they haven’t committed).
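
If you’ve never seen what “using data to make predictions” looks like in practice, here’s a minimal sketch using the scikit-learn Python library. Everything in it — the features, the labels, the numbers — is invented purely for illustration:

```python
# A minimal sketch of "learning from data to make predictions"
# using scikit-learn. All data here is made up for illustration.
from sklearn.linear_model import LogisticRegression

# Hypothetical training examples: [hours of sun, inches of rain],
# labeled 1 if the picnic happened, 0 if it was called off.
X_train = [[8, 0], [7, 1], [2, 4], [1, 5], [6, 0], [3, 3]]
y_train = [1, 1, 0, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)      # "learn" a rule from past examples

# Predict an outcome the model has never seen before:
print(model.predict([[5, 2]]))   # the model guesses picnic (1) or not (0)
```

The important thing to notice: the model never knows anything beyond the examples it was trained on. If those examples are skewed, so are its predictions — which is exactly the worry in the stories below.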

I’ve come to worry less that the machines will turn against us and more that we are (as usual?) using the machines against ourselves.

  • CPD expands predictive policing technology, deploys 86 new officers [Chicago Sun-Times | Tom Schuba] “Shootings are down by nearly 34 percent this year compared to the same period last year in the districts that are currently using the tools, the statement said. Those numbers outpace the overall 28 percent reduction of citywide shootings so far this year.”
  • Palantir has secretly been using New Orleans to test its predictive policing technology [The Verge | Ali Winston] “Predictive policing technology has proven highly controversial wherever it is implemented, but in New Orleans, the program escaped public notice, partly because Palantir established it as a philanthropic relationship with the city through Mayor Mitch Landrieu’s signature NOLA For Life program.”
  • China using big data to detain people before crime is committed [The Globe and Mail | Nathan Vanderklippe] “Chinese police theorists have identified specific ‘extremist behaviours, which include if you store a large amount of food in your home, if your child suddenly quits school and so on,’ said Maya Wang, senior China researcher at Human Rights Watch. Train a computer to look for such conduct, and ‘then you have a big data program modelled upon pretty racist ideas about peaceful behaviours.’”
  • How to Fight Bias with Predictive Policing [Scientific American | Eric Siegel] “Predictive policing uncovers racial inequity, which it threatens to perpetuate – but, if we turn things around, it also presents an unprecedented opportunity to advance social justice.” (A toy sketch of that feedback loop appears after this list.)

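The Scientific American piece gets at why these systems can perpetuate inequity: if the records a model trains on reflect where police have been looking rather than where crime actually occurs, the predictions feed back into themselves. Here’s a toy simulation of that feedback loop — not any vendor’s actual algorithm; the districts, crime rate, and allocation rule are all invented:

```python
# A toy simulation of the "runaway feedback loop" critics describe:
# recorded crime tracks patrol intensity rather than true crime, and
# next year's patrols are allocated based on this year's records.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1          # assume both districts have IDENTICAL crime
patrols = {"A": 50, "B": 50}   # start with patrols split evenly

for year in range(5):
    recorded = {}
    for district, n_patrols in patrols.items():
        # A crime only enters the records if a patrol is there to see it,
        # so recorded crime reflects patrol presence, not true crime.
        recorded[district] = sum(
            1 for _ in range(n_patrols) if random.random() < TRUE_CRIME_RATE
        )
    total = sum(recorded.values()) or 1
    # The "predictive" step: send next year's 100 patrols wherever
    # this year's records say the crime is.
    patrols = {d: round(100 * recorded[d] / total) for d in recorded}
    print(year, recorded, patrols)
```

Run it and a small random fluctuation in year one — a few extra recorded incidents in one district — pulls more patrols there, which produces more records there, which pulls still more patrols, even though both districts were identical by construction. That, in miniature, is the concern with training on arrest data.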
From the Ohio Web Library:
