Does big data have a discrimination problem? Some say…maybe

Critics allege big data can be discriminatory, but is it really bias?
Big data is increasingly viewed as a strategic asset that can transform organizations through its use of powerful predictive technologies.
But when it comes to systems that help make such decisions, the methods applied may not always be fair and just, according to a panel of social researchers who study the impact of big data on the public and on society.
The event, organized recently by New York University's Politics Society and Students for Criminal Justice Reform, centered on issues arising out of big data's use in machine learning and data mining to drive public and private sector executive decisions.
The panel, which included a mix of policy researchers, technologists, and journalists, discussed ways in which big data—while enhancing our ability to make evidence-based decisions—may inadvertently set rules and processes that are inherently biased and discriminatory.
The rules, in this case, are algorithms: sets of mathematical procedures coded to achieve a particular goal. Critics argue these algorithms may perpetuate biases and reinforce built-in assumptions.

Drive on! Using big data like 'a car engine'

Government agencies have recently begun scrutinizing the ethical implications of the emerging field. Last week, a White House report cautioned that data collection, if not applied correctly, could undermine civil rights. The report called for a conversation to determine "how best to encourage the potential of these technologies while minimizing risks to privacy, fair treatment, and other core American values."
In his 2014 report titled "Big Data's Disparate Impact," Solon Barocas, a panel member and a research fellow with the Center for Information Technology Policy at Princeton University, points out that "advocates of algorithmic techniques like data mining argue that they eliminate human biases from the decision-making process. But an algorithm is only as good as the data it works with."
Barocas studies the impact of emerging applications of machine learning and the ethical and epistemological issues that they raise. He added that "data mining can inherit the prejudices of prior decision-makers or reflect the widespread biases that persist in society at large."
In other words, machine learning systems run on data produced by humans and on algorithms designed by humans. That data carries the implicit biases of the people who created it.
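The "bias in, bias out" dynamic the panelists describe can be made concrete with a toy sketch. The scenario below is entirely hypothetical (the groups, records, and approval rule are invented for illustration): a naive "model" that simply learns each group's historical approval rate will faithfully reproduce any discrimination baked into those past decisions.

```python
# Hypothetical illustration: a model trained on biased historical
# decisions reproduces that bias. All data here is fabricated.
from collections import defaultdict

# Past decisions: (group, approved). Applicants are assumed equally
# qualified, yet group "B" was historically approved far less often.
history = [
    ("A", 1), ("A", 1), ("A", 1), ("A", 0),
    ("B", 1), ("B", 0), ("B", 0), ("B", 0),
]

def train(records):
    """'Learn' each group's approval rate from past decisions."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        approvals[group] += decision
    return {g: approvals[g] / totals[g] for g in totals}

def predict(model, group, threshold=0.5):
    """Approve a new applicant iff their group's historical rate clears the bar."""
    return model[group] >= threshold

model = train(history)
print(model)                # {'A': 0.75, 'B': 0.25}
print(predict(model, "A"))  # True  -- the historical bias is now a rule
print(predict(model, "B"))  # False
```

No one coded "discriminate against B" into this algorithm; the disparity enters entirely through the training data, which is the point Barocas makes above.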

