When a doll rats out a parent: Tech firms struggle with thorny privacy issues

The rise of intelligent devices, from wearables to smart home sensors to Internet-connected Barbie dolls, is confronting technology companies with a host of new ethical issues involving privacy and security.

That’s forcing them to think carefully in advance about the principles they need to apply to a wide range of products for the emerging Internet of Things, says Jules Polonetsky, chief executive of the Future of Privacy Forum, a Washington, D.C.-based think tank with 130 members from around the world, many of whom serve as chief privacy officers at big tech companies.

Polonetsky laid out the emerging ethical and business issues at a Data Privacy Day event held Thursday by the National Cyber Security Alliance at the San Francisco headquarters of Twitter Inc., a sponsor of the event. Data Privacy Day is an annual celebration to recognize the Jan. 28, 1981, signing of Convention 108, the first legally binding international treaty concerning privacy and data protection.

Polonetsky sat down with Jeff Frick (@JeffFrick), co-host of theCUBE, SiliconANGLE Media’s mobile video studio, to discuss the challenges his organization sees to privacy in the age of big data and how companies need to respond to those issues. This is one of a series of interviews with top executives and thought leaders at the event. The rest will run in coming days.

Polonetsky said an overriding dilemma is that America, at least, is of two minds when it comes to privacy. “We don’t have clear consensus over whether we want the government keeping us safe by being able to catch every criminal, or not getting into our stuff because we don’t trust them.” As the opportunities to mine data from self-driving cars, wearable devices and more continue to grow, establishing fundamental principles for private companies to use in developing their technology becomes paramount.

“We say, ‘Listen, how can we have data that’ll make cars safer, how can we have wearables that’ll help improve fitness, but also have reasonable, responsible rules in place so that we don’t end up with discrimination or data breaches and all the problems that can come along?’” Polonetsky said.

The problem is that the very data that makes many new technologies work also potentially makes them threatening. For example, advertising technology makes it possible to offer a lot of media to consumers for free, but it can also collect more data about people than they would like.
