Natural Language Processing Poised to Have a Big Impact on the Data Economy


The stakes are large in the Natural Language Processing (NLP) market: it is the high ground in the battle for control of the data economy and the key to turning silicon into gold, according to a report issued this quarter by market intelligence firm Tractica.

Report authors Bruce Daley, the Principal Analyst, and Clint Wheelock, the Managing Director, cite as an example Facebook's upcoming virtual digital assistant, Moneypenny. Noting that Facebook already generates average daily revenue of $11.96 per active user, they surmise that if the same number of daily active users adopt Moneypenny and each generates just $1 in additional ad revenue, the program would add $1 billion to the company's annual top line and almost as much to its bottom line.
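The back-of-envelope math behind that estimate can be sketched in a few lines. This is only an illustration of the report's reasoning, assuming a user base of roughly one billion daily active users (the order of magnitude Facebook reported at the time); the report's exact figure may differ.

```python
# Back-of-envelope check of the Moneypenny revenue estimate.
# Assumption: roughly 1 billion daily active users adopt the assistant.
daily_active_users = 1_000_000_000
extra_ad_revenue_per_user = 1.00  # the report's hypothetical extra $1 per user

added_top_line = daily_active_users * extra_ad_revenue_per_user
print(f"Added annual top line: ${added_top_line / 1e9:.1f} billion")
```

With those inputs the extra revenue works out to $1 billion, matching the report's claim; because the assistant would add little incremental cost, most of that flows to the bottom line as well.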

With these and other tantalizing economic prospects at hand, is it any wonder that big names from Amazon to Apple and from Google to Microsoft – not to mention IBM, Nuance, AT&T, and others – are driving further research into NLP and acquiring companies specializing in the space? This month saw another NLP acquisition, as well-known CRM vendor SugarCRM purchased Contastic's NLP technology. The platform analyzes communications between salespeople and their contacts in email, LinkedIn, and other sources, helping the former send the latter appropriate content to keep those relationships on track. “SugarCRM will use Contastic’s NLP technology to analyze data within the Sugar platform so users can automatically send personalized content [to customers],” CEO Larry Augustin said in a statement about the acquisition.

Businesses with NLP in Their Sights

Natural Language Processing, the Tractica report notes, is arguably the most leveraged technology in Artificial Intelligence (AI), and it is improving rapidly thanks to advances in related technologies such as Deep Learning and Cognitive Computing. It is already providing a competitive advantage to businesses in digital ad services, legal, and media. Other sectors – automotive, healthcare, education, and retail – are likely to invest in the technology as new business models in those areas take shape and as – or if – NLP continues to evolve “to correctly interpret and adapt to the wide variety of human language and become engaging in the process” in the form of virtual digital assistants of one type or another.

The report’s authors envision that the day may come when “applications that are both completely personalized and yet generic to produce” will result in shoppers turning to Amazon Alexa to answer product and inventory questions, individuals reaching out to Microsoft Cortana for investment advice, students being tutored by Apple Siri, or schizophrenia being diagnosed by IBM Watson.

Indeed, a report published by MarketsandMarkets last year projects that the global healthcare and life sciences NLP market will see a CAGR of close to 20 percent between 2015 and 2020. Evidence of traction this year in the healthcare area comes by way of eviCore healthcare’s acquisition of QPID Health, whose technology combines NLP, clinical logic, and Machine Learning to generate patient facts from information found in records of any format – including unstructured notes – to improve understanding of a patient’s history. IBM Watson Health also took another step forward with the announcement that it is acquiring Truven Health Analytics, adding to the data set on patients and health that Watson can ingest to help bring medical insights to physicians.

A Take on the NLP Movement

Tractica forecasts that annual revenue from NLP software purchased by enterprises for their own internal applications will grow from less than $30 million worldwide in 2015 to over $200 million in 2024, and that total NLP hardware, software, and services revenue will reach $2.1 billion by 2024.
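The growth rate implied by that software forecast can be sketched as a quick compound-annual-growth-rate (CAGR) calculation. This uses the round $30 million and $200 million figures quoted above as endpoints; the report's precise numbers, and therefore the exact rate, may differ slightly.

```python
# Implied compound annual growth rate (CAGR) of Tractica's forecast for
# enterprise NLP software revenue: ~$30M in 2015 to ~$200M in 2024.
start_revenue = 30e6   # 2015 revenue, in dollars
end_revenue = 200e6    # 2024 revenue, in dollars
years = 2024 - 2015    # nine-year forecast span

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # a bit over 20 percent per year
```

In other words, the forecast amounts to enterprise NLP software revenue compounding at somewhat more than 20 percent annually over the forecast period.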
