Using Big Data to Predict Terrorist Acts Amid Privacy Concerns

Before Ahmad Khan Rahami planted bombs in New York and New Jersey, he bought bomb-making materials on eBay, linked to jihad-related videos from his public social-media account and was looked into by law enforcement agents, according to the Federal Bureau of Investigation.

If only the authorities had connected the dots.

That challenge — mining billions of bits of information and crunching the data to find crucial clues — is behind a push by U.S. intelligence and law enforcement agencies to harness “big data” to predict crimes, terrorist acts and social upheaval before they happen. The market for such “predictive analytics” technology is estimated to reach $9.2 billion by 2020, up from $3 billion in 2015, according to research firm MarketsandMarkets.

It’s the stuff of a science-fiction movie like “Minority Report,” in which Tom Cruise played a Washington cop who used technology to arrest people before they carried out crimes. It’s also a red flag for privacy advocates already fighting U.S. spy programs exposed by Edward Snowden and the FBI’s demands that Apple Inc. help it hack into encrypted mobile phones.

The idea is to make sense of the vast and disparate streams of data from sources including social media, GPS devices, video feeds from street cameras and license-plate readers, travel and credit-card records and the news media, as well as government and proprietary systems.
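
To make that data-fusion idea concrete, here is a minimal, hypothetical sketch in Python. It merges events attributed to the same subject across several notional feeds and flags subjects whose combined, multi-source activity crosses a simple threshold. The feed names, event fields, weights and threshold rule are illustrative assumptions, not any agency's or vendor's actual pipeline.

```python
# Hypothetical sketch of "connecting the dots": fuse events from disparate
# feeds into a per-subject score. All names and rules here are illustrative
# assumptions, not a real system.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    subject_id: str   # identity resolved across feeds (itself a hard problem)
    source: str       # e.g. "purchases", "social_media", "travel", "tips"
    label: str        # normalized description of the activity
    weight: int       # analyst-assigned significance of this signal type

def correlate(events, threshold=3):
    """Group events by subject and flag subjects whose summed weight,
    corroborated by more than one source, meets the threshold."""
    by_subject = defaultdict(list)
    for ev in events:
        by_subject[ev.subject_id].append(ev)

    flagged = {}
    for subject, evs in by_subject.items():
        sources = {ev.source for ev in evs}
        score = sum(ev.weight for ev in evs)
        # Require corroboration from at least two distinct feeds before flagging.
        if len(sources) >= 2 and score >= threshold:
            flagged[subject] = (score, sorted(sources))
    return flagged

if __name__ == "__main__":
    sample = [
        Event("subject-1", "purchases", "igniters and ball bearings", 2),
        Event("subject-1", "social_media", "links to violent propaganda", 1),
        Event("subject-1", "tips", "family member report, later recanted", 1),
        Event("subject-2", "travel", "routine overseas trip", 1),
    ]
    for subject, (score, sources) in correlate(sample).items():
        print(f"{subject}: score={score}, corroborating sources={sources}")
```

Even this toy version exposes the hard parts: resolving identities across feeds, weighting signals and choosing a threshold are all judgment calls, and the resulting false positives are precisely what the privacy advocates quoted below worry about.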

“Data is going to be the fundamental fuel for national security in this century,” William Roper, director of the Defense Department’s strategic capabilities office, said at a conference in Washington last month.

On Wednesday, the White House released its first strategic plan for advancing research and development of artificial intelligence technology, including its use to predict incidents that may endanger public safety.

Weeks before Rahami allegedly carried out the attacks in September, he bought circuit boards, electric igniters and ball bearings — all of which are known bomb-making materials, according to charging documents from the FBI.

In previous years, he was flagged by U.S. Customs and Border Protection and the FBI after he made trips to Pakistan and after his father told police he was a terrorist, a remark the father later recanted.

Law enforcement agents could have been tipped off that Rahami was moving toward an attack had all of those data points been pulled together in one place, said Mark Testoni, chief executive officer and president of SAP National Security Services Inc., a U.S.-based subsidiary of German software company SAP SE.

“This is a big data world now,” said Testoni. He said his company has developed a computer platform for doing predictive analytics that is being used in a limited way by a Defense Department agency and by a national security agency. He declined to name the government customers or specify what they are doing.

The technology to predict events is only in its infancy, Testoni said. National security and law enforcement agencies also have different rules when it comes to obtaining and using data, meaning there are walls between what can be accessed and shared, he said. U.S. law enforcement agencies, for example, need a court warrant to access most data.

Privacy advocates express concern about the “Big Brother” implications of such massive data-gathering, calling for more information and public debate about how predictive technology will be used.

“There’s often very little transparency into what’s being brought into the systems or how it’s being crunched and used,” said Rachel Levinson-Waldman, senior counsel to the National Security Program at the Brennan Center for Justice at New York University School of Law. “That also makes it very hard to go back and challenge information that might be incorrect.”

Computer algorithms also fail to understand the context of data, such as whether someone commenting on social media is joking or serious, Levinson-Waldman said.