Enterprises today rely on data as the foundation of business success, whether the goal is to better understand customers, build new or better products and services, or manage cost and risk. Data is now the prime raw material for creating value; across all industries, it’s the norm to hold vast stores of data.
An issue that remains unresolved, however, is how well and how efficiently data can be applied.
Firms are still wrestling with the challenge of making big data work for them, in use cases ranging from enterprise analytics, customer 360, and product personalization to revenue assurance and fraud detection. All the data in the world has no value unless it’s accessible and actionable. Transforming modern data volumes into usable information requires new approaches to analytics.
Big data tends to be held in silos – billing, credit history, customer transactions, marketing. Some organizations warehouse their data after just a few months, which makes it very difficult to access. You can't get a 360-degree view of a customer without looking across all areas of her interactions with an organization, and you need to analyze transactions over a 13-month window in order to understand typical activity over a full year. Data is far more valuable and useful when held centrally.
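To make the cross-silo idea concrete, here is a minimal sketch in Python with pandas. The silo names, column names, and figures are all hypothetical illustrations, not any particular operator's schema: it joins small billing, transaction, and marketing extracts on a shared customer ID and restricts activity to the last 13 months before aggregating.

```python
import pandas as pd

# Hypothetical extracts from three silos, keyed on customer_id.
billing = pd.DataFrame({
    "customer_id": [1, 2],
    "monthly_charge": [49.99, 79.99],
})
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "ts": pd.to_datetime(["2015-01-05", "2015-12-20", "2015-06-01"]),
    "amount": [10.0, 25.0, 5.0],
})
marketing = pd.DataFrame({
    "customer_id": [1, 2],
    "segment": ["premium", "standard"],
})

# Keep a 13-month window so each month can be compared with the
# same month one year earlier.
cutoff = transactions["ts"].max() - pd.DateOffset(months=13)
recent = transactions[transactions["ts"] > cutoff]

# Join the silos into a single 360-degree view per customer.
view = (
    recent.groupby("customer_id", as_index=False)["amount"].sum()
    .merge(billing, on="customer_id")
    .merge(marketing, on="customer_id")
)
print(view)
```

In a real enterprise data hub the joins would run at scale over Hadoop-resident tables rather than in-memory frames, but the shape of the query – join on a common key, window by time, aggregate – is the same.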
As data volumes grow at exponential rates and new types of data become available every day, users demand more and faster access. It also becomes increasingly burdensome to move all that data around for each new business question or use case. At the same time, IT must ensure SLA performance, control costs, and manage security and compliance.
Many organizations realize their existing systems alone are not sufficient to keep pace with this rate of change and turn to a new approach to complement their existing investments: an enterprise data hub. As a unified platform that can economically store unlimited data and enable diverse access to it at scale, the enterprise data hub is emerging as the architectural center of a modern data strategy.
The enterprise data hub does much to overcome the data silo issue. The evolution of Hadoop and Hadoop-based enterprise data platforms such as Cloudera and Hortonworks has been key to the emergence of the enterprise data hub, transforming the economics, scalability, and flexibility of storing and using massive amounts of data.
The thorny problem remains of how to access, analyze, and leverage the data to optimize business opportunity. Communications service providers (CSPs) are paving the way in this regard by using real-time, machine-learning applications to detect and prevent fraud and bolster financial results.
Revenue assurance and fraud prevention and detection are major focus areas for CSPs. Fraud alone costs them $38 billion a year, according to the Communications Fraud Control Association's 2015 survey. Operators worldwide are experiencing a huge surge in high-velocity attacks and sophisticated new fraud types that are invisible to traditional detection methods.
Facebook, Google, and LinkedIn have pioneered big data and machine-learning approaches to protecting their subscribers and gaining insight into vast amounts of data. CSPs are now using these advanced big-data approaches to detect and analyze fraud. They start by combining their data silos into one vast data lake that can be tapped – using a combination of big data, Hadoop, and machine learning – to provide real-time information about anomalous behavior as it happens. When you have enough data and you have access to that data in real time, you can detect fraud in real time.
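The core mechanism behind real-time detection of anomalous behavior can be sketched very simply: score each new event against a subscriber's own running baseline and alert when it deviates sharply. The following Python sketch is an illustration only – the class name, threshold, and call counts are invented for this example, and production CSP systems use far richer models – but it shows the streaming idea using Welford's online algorithm for the running mean and variance.

```python
import math
from collections import defaultdict

class CallVolumeMonitor:
    """Flag subscribers whose per-interval call volume deviates
    sharply from their own running baseline."""

    def __init__(self, threshold=3.0, min_samples=5):
        self.threshold = threshold      # z-score that counts as anomalous
        self.min_samples = min_samples  # don't alert before a baseline exists
        self.stats = defaultdict(lambda: [0, 0.0, 0.0])  # n, mean, M2

    def observe(self, subscriber, calls):
        n, mean, m2 = self.stats[subscriber]
        # Score against the baseline *before* folding in the new value.
        anomalous = False
        if n >= self.min_samples:
            std = math.sqrt(m2 / (n - 1))
            if std > 0 and abs(calls - mean) / std > self.threshold:
                anomalous = True
        # Welford's online update of the running mean and variance.
        n += 1
        delta = calls - mean
        mean += delta / n
        m2 += delta * (calls - mean)
        self.stats[subscriber] = [n, mean, m2]
        return anomalous

monitor = CallVolumeMonitor()
history = [12, 14, 11, 13, 12, 14]   # typical hourly call counts
flags = [monitor.observe("sub-001", c) for c in history]
spike = monitor.observe("sub-001", 300)  # sudden surge in volume
print(flags, spike)
```

Because the state per subscriber is just three numbers, this kind of scoring can run inside a streaming pipeline over the data lake, flagging the high-velocity attacks described above as they happen rather than in a batch report days later.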