EU Data Protection Law May End The Unknowable Algorithm

Slated to take effect as law across the EU in 2018, the General Data Protection Regulation could require companies to explain their algorithms to avoid unlawful discrimination.

Europe's data protection rules have established a "right to be forgotten," to the consternation of technology companies like Google that have built businesses on computational memory. The rules also outline a "right to explanation," by which people can seek clarification about algorithmic decisions that affect them.

In a paper published last month, Bryce Goodman, a Clarendon Scholar at the Oxford Internet Institute, and Seth Flaxman, a postdoctoral researcher in Oxford's Department of Statistics, describe the challenges this right poses to businesses and the opportunities it presents to machine learning researchers to design algorithms that are open to evaluation and scrutiny.
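To make the idea of an algorithm "open to evaluation and scrutiny" concrete, consider a minimal sketch, not drawn from the paper itself: a logistic regression whose learned coefficients can be read as per-feature explanations of a decision. The data, feature names, and weights below are invented purely for illustration.

```python
# Hypothetical sketch: a model whose output can be explained feature by
# feature. Data and feature names are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
feature_names = ["income", "years_employed", "late_payments"]

# Synthetic applicants: three numeric features and a binary approve/deny
# label generated from a known linear rule plus noise.
X = rng.normal(size=(500, 3))
y = (X @ np.array([1.5, 1.0, -2.0]) + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Each learned coefficient is a human-readable statement of how a feature
# pushes the decision: one simple, auditable form of "explanation."
for name, coef in zip(feature_names, model.coef_[0]):
    print(f"{name}: {coef:+.2f} (log-odds per unit increase)")
```

Simple linear models trade predictive power for this kind of legibility, which hints at why a right to explanation gets harder to satisfy as systems grow more complex.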

The rationale for requiring companies to explain their algorithms is to avoid unlawful discrimination. In his 2015 book The Black Box Society, University of Maryland law professor Frank Pasquale describes the problem with opaque programming.

"Credit raters, search engines, major banks, and the TSA take in data about us and convert it into scores, rankings, risk calculations, and watch lists with vitally important consequences," Pasquale wrote. "But the proprietary algorithms by which they do so are immune from scrutiny."

Several academic studies have already explored the potential for algorithmic discrimination.

A 2015 study by researchers at Carnegie Mellon University, for example, found that Google showed ads for high-income jobs to men more frequently than to women.

That's not to say Google did so intentionally. But as other researchers have suggested, algorithmic discrimination can be an unintended consequence of reliance on inaccurate or biased data.

Google did not immediately respond to a request to discuss whether it changed its advertising algorithm in response to the research findings.

A 2014 paper from the Data & Society Research Institute echoes the finding that inappropriate algorithmic bias tends to be inadvertent. It states, "Although most companies do not intentionally engage in discriminatory hiring practices (particularly on the basis of protected classes), their reliance on automated systems, algorithms, and existing networks systematically benefits some at the expense of others, often without employers even recognizing the biases of such mechanisms."
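One widely used check for this kind of inadvertent skew, borrowed from US employment law, is the "four-fifths" disparate impact rule: a process is flagged when one group's selection rate falls below 80% of another's. The sketch below shows the arithmetic using invented numbers; it is not data from any of the studies cited here.

```python
# Hypothetical sketch of the "four-fifths" disparate impact test.
# All figures below are invented; they are not from the cited studies.

def impact_ratio(selected_a: int, total_a: int,
                 selected_b: int, total_b: int) -> float:
    """Ratio of the lower group's selection rate to the higher group's."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Suppose an automated screen advances 90 of 1,000 applicants from one
# group but 180 of 1,000 from another (hypothetical numbers).
ratio = impact_ratio(90, 1000, 180, 1000)
print(f"Impact ratio: {ratio:.2f}")  # 0.50
print("Flag for review" if ratio < 0.8 else "Within the 80% threshold")
```

A test like this can surface a disparity without anyone having intended it, which is precisely the scenario the Data & Society paper describes.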

Between Europe's General Data Protection Regulation (GDPR), which is scheduled to take effect in 2018, and existing regulations, companies would do well to pay more attention to the way they implement algorithms and machine learning.

But adhering to the rules won't necessarily be easy, according to Goodman and Flaxman.
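Part of the difficulty is technical: many high-performing models simply have no compact, human-readable rationale to disclose. As a rough illustration (a hypothetical sketch, not from Goodman and Flaxman's paper), even a modest random forest encodes thousands of branching rules behind each prediction.

```python
# Hypothetical sketch: even a modest ensemble has no compact rationale.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # a simple nonlinear target

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Count every branching node across the ensemble. Any single prediction
# is the vote of 100 trees, each traversing its own chain of splits.
total_nodes = sum(tree.tree_.node_count for tree in forest.estimators_)
print(f"{len(forest.estimators_)} trees, {total_nodes} decision nodes")
```

Summarizing that tangle of splits into an explanation a regulator or consumer could act on is exactly the kind of open research problem the paper points to.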


