Technology is not a bad thing; it’s not inherently scary. Sometimes new technology gets misused or suffers mission creep, but most of the time, tech actually makes our lives easier and better. Here are two tales about “new” tech that could predict the future. One seems scarier than the other.
Algorithm to predict at birth if a person will be a criminal
Algorithms control aspects of your life whether you are aware of it or not. They are used to come up with risk scores and even to predict the future. But how would you feel about an algorithm that seems ripped straight from Minority Report? It would flag criminals long before they could commit a crime, since it would “predict at the time of someone’s birth how likely she is to commit a crime by the time she turns 18.”
Richard Berk, a statistician and University of Pennsylvania professor, is working on such an algorithm, according to Bloomberg. He believes it can “predict at the moment of birth whether people will commit a crime by their 18th birthday, based on factors such as environment and the history of a new child’s parents.”
Algorithms are already used “to decide which blocks police officers should patrol, where to put inmates in prison, and who to let out on parole.” Supporters claim such tools are fairer than leaving decisions to individuals with prejudices or to judges having a bad day, and that they will do away with bias; critics counter that the prejudices are built into the algorithms themselves.
Berk said, “If you want me to do a totally race-neutral forecast, you’ve got to tell me what variables you’re going to allow me to use, and nobody can, because everything is confounded with race and gender.”
Bloomberg reported that Berk’s algorithms “have been used by prisons to determine which inmates to place in restrictive settings; parole departments to choose how closely to supervise people being released from prison; and police officers to predict whether people arrested for domestic violence will re-offend.” He’s even come up with a tool for OSHA to determine “which workplaces were likely to commit safety violations.” Pennsylvania will start a pilot program in the fall that uses Berk’s algorithm to decide how long to sentence a person to prison.
Since Berk’s system will be used for sentencing decisions, it was troubling to read that 29 to 39 percent of its risk score predictions are wrong; Berk said “focusing on accuracy misses the point.” A person wrongly labeled a future criminal would likely disagree, though an actual future offender scored as low risk might not mind.
When talking about crime predictions, he said, “The policy position that is taken is that it’s much more dangerous to release Darth Vader than it is to incarcerate Luke Skywalker.”
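The Darth Vader vs. Luke Skywalker framing describes a cost-asymmetric decision: a missed dangerous release (false negative) is weighted as far costlier than an unnecessary detention (false positive). A minimal sketch of how such a cost-weighted threshold works, assuming hypothetical cost ratios and risk scores (this is an illustration, not Berk’s actual model):

```python
# Illustrative cost-weighted risk thresholding. All numbers are
# hypothetical; this is NOT Berk's actual algorithm. If a false
# negative (releasing a re-offender) is treated as k times as costly
# as a false positive (detaining a non-offender), the break-even
# probability threshold drops from 0.5 to fp_cost / (fn_cost + fp_cost).

def release_threshold(fn_cost: float, fp_cost: float) -> float:
    """Probability of re-offending above which detention minimizes
    expected cost: detain when p * fn_cost > (1 - p) * fp_cost."""
    return fp_cost / (fn_cost + fp_cost)

def decide(risk_score: float, fn_cost: float = 10.0, fp_cost: float = 1.0) -> str:
    """Classify one case given a predicted probability of re-offending."""
    return "detain" if risk_score > release_threshold(fn_cost, fp_cost) else "release"

# With a 10:1 cost ratio, anyone scored above ~0.091 is detained.
print(round(release_threshold(10.0, 1.0), 4))  # 0.0909
print(decide(0.2))                             # detain
print(decide(0.05))                            # release
```

Note how the threshold is set by a policy choice about relative costs, not by the model’s accuracy, which is exactly why a 29 to 39 percent error rate can be waved off as “missing the point” while still detaining many low-probability cases.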