EU Data Protection Law May End The Unknowable Algorithm

Slated to take effect as law across the EU in 2018, the General Data Protection Regulation could require companies to explain their algorithms to avoid unlawful discrimination.

Europe's data protection rules have established a "right to be forgotten," to the consternation of technology companies like Google that have built businesses on computational memory. The rules also outline a "right to explanation," by which people can seek clarification about algorithmic decisions that affect them.

In a paper published last month, Bryce Goodman, Clarendon Scholar at the Oxford Internet Institute, and Seth Flaxman, a post-doctoral researcher in Oxford's Department of Statistics, describe the challenges this right poses to businesses and the opportunities it presents to machine learning researchers to design algorithms that are open to evaluation and scrutiny.

The rationale for requiring companies to explain their algorithms is to avoid unlawful discrimination. In his 2015 book The Black Box Society, University of Maryland law professor Frank Pasquale describes the problem with opaque programming.


"Credit raters, search engines, major banks, and the TSA take in data about us and convert it into scores, rankings, risk calculations, and watch lists with vitally important consequences," Pasquale wrote. "But the proprietary algorithms by which they do so are immune from scrutiny."

Several academic studies have already explored the potential for algorithmic discrimination.

A 2015 study by researchers at Carnegie Mellon University, for example, found that Google showed ads for high-income jobs to men more frequently than to women.

That's not to say Google did so intentionally. But as other researchers have suggested, algorithmic discrimination can be an unintended consequence of reliance on inaccurate or biased data.

Google did not immediately respond to a request to discuss whether it changed its advertising algorithm in response to the research findings.

A 2014 paper from the Data & Society Research Institute echoes the finding that inappropriate algorithmic bias tends to be inadvertent. It states, "Although most companies do not intentionally engage in discriminatory hiring practices (particularly on the basis of protected classes), their reliance on automated systems, algorithms, and existing networks systematically benefits some at the expense of others, often without employers even recognizing the biases of such mechanisms."


Between Europe's General Data Protection Regulation (GDPR), which is scheduled to take effect in 2018, and existing regulations, companies would do well to pay more attention to the way they implement algorithms and machine learning.

But adhering to the rules won't necessarily be easy, according to Goodman and Flaxman.



