Make Algorithms Accountable

Algorithms are ubiquitous in our lives.

They map out the best route to our destination and help us find new music based on what we listen to now. But they are also being employed to inform fundamental decisions about our lives.

Companies use them to sort through stacks of résumés from job seekers. Credit agencies use them to determine our credit scores. And the criminal justice system is increasingly using algorithms to predict a defendant’s future criminality.

Those computer-generated criminal “risk scores” were at the center of a recent Wisconsin Supreme Court decision that set the first significant limits on the use of risk algorithms in sentencing. The court ruled that while judges could use these risk scores, the scores could not be a “determinative” factor in whether a defendant was jailed or placed on probation. And, most important, the court stipulated that a presentence report submitted to the judge must include a warning about the limits of the algorithm’s accuracy.

This warning requirement is an important milestone in the debate over how our data-driven society should hold decision-making software accountable. But advocates for big data due process argue that much more must be done to ensure the appropriateness and accuracy of algorithmic results. (An algorithm is a procedure or set of instructions, often carried out by a computer, for solving a problem.)

Many algorithms are secret. In Wisconsin, for instance, the risk-score formula was developed by a private company and has never been publicly disclosed because it is considered proprietary. This secrecy has made it difficult for lawyers to challenge a result. The credit score is the lone algorithmically generated score whose underlying data consumers have a legal right to examine and challenge.
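To make the stakes concrete, here is a purely hypothetical sketch of what such a scoring procedure might look like. Every input and weight below is invented for illustration; the actual Wisconsin formula has never been disclosed, so nothing here reflects how it really works. The point is that without access to the formula, a defendant cannot know which inputs drove the score or whether they were accurate.

```python
# Hypothetical risk-score sketch. The real formulas are proprietary and
# undisclosed; these inputs and weights are invented for illustration only.

def risk_score(prior_arrests: int, age: int, employed: bool) -> int:
    """Return an invented 1-10 'risk' score from a few example inputs."""
    score = 1
    score += min(prior_arrests, 5)  # more prior arrests -> higher score
    if age < 25:
        score += 2                  # youth weighted as higher risk
    if not employed:
        score += 2                  # unemployment weighted as higher risk
    return min(score, 10)           # cap the score at 10

print(risk_score(prior_arrests=3, age=22, employed=False))  # prints 8
```

Even in this toy version, a single wrong input (say, a mistakenly recorded arrest) silently shifts the outcome, which is exactly the kind of error due-process advocates want people to be able to see and contest.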

In 1970, President Richard M. Nixon signed the Fair Credit Reporting Act. It gave people the right to see the data in their credit reports and to challenge and delete data that was inaccurate. For most other algorithms, people are expected to read fine-print privacy policies, in the hopes of determining whether their data might be used against them in a way that they wouldn’t expect.

“We urgently need more due process with the algorithmic systems influencing our lives,” says Kate Crawford, a principal researcher at Microsoft Research who has called for big data due process requirements. “If you are given a score that jeopardizes your ability to get a job, housing or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.”


