Tech leaders must act quickly to ensure algorithmic fairness

Do big data algorithms treat people differently based on characteristics like race, religion, and gender? Cathy O'Neil in her new book Weapons of Math Destruction and Frank Pasquale in The Black Box Society both look closely and critically at concerns over discrimination, the challenge of knowing whether algorithms are treating people unfairly, and the role of public policy in addressing these questions.

Tech leaders must take seriously the debate over data usage — both because discrimination in any form has to be addressed, and because a failure to do so could lead to misguided measures such as mandated disclosure of algorithmic source code.

What’s not in question is that the benefits of the latest computational tools are all around us. Machine learning helps doctors diagnose cancer; speech recognition software simplifies everyday interactions and helps people with disabilities; educational software improves learning and prepares children for the challenges of a global economy; and new analytics and data sources are extending credit to previously excluded groups. Autonomous cars, meanwhile, promise to reduce accidents by 90 percent.

Jason Furman, chairman of the Council of Economic Advisers, got it right when he said in a recent speech that his biggest worry about artificial intelligence is that we do not have enough of it.

Of course, any technology, new or old, can further illegal or harmful activities, and the latest computational tools are no exception. By the same token, the existing laws that protect consumers and citizens from harm and discrimination make no exception for big data analysis.

The Fair Credit Reporting Act protects the public against the use of inaccurate or incomplete information in decisions regarding credit, employment, and insurance. Though passed in the 1970s, the law has been effectively applied to businesses that use advanced data-analysis techniques, including the scraping of personal data from social media to create profiles of job applicants.

Further, no enterprise can legally use computational techniques to evade statutory prohibitions against discrimination on the basis of race, color, religion, gender, and national origin in employment, credit, and housing.


