A quick search on LinkedIn reveals thousands of professionals in the United States now hold the recently established title of Chief Analytics Officer (CAO). Analytics officers have ascended to the C-suite amongst a constellation of new roles and responsibilities that cut across departmental lines at Fortune 1000 companies. These positions are driven by the influx of data companies must now contend with, even in industries that were not previously data-oriented.
The CAO’s role most closely aligns with business intelligence, leveraging data analytics to create real business value and inform strategic decisions. Yet the CAO’s responsibilities also encompass identifying the constantly changing threats and opportunities impacting the business.
The most dramatic shift in data-driven business intelligence that has necessitated this role is the sheer volume, variety, and velocity of data now available to the enterprise. Data is no longer just static or historical, but real-time, streaming, unstructured, and abundant from both public and proprietary sources.
Unstructured data is the fastest growing category of data, and within it, stream data (time-stamped series of records) is the fastest growing sub-category. Stream data spans messaging, social media, mobile data, CRM, sales, support, IT data, sensor and device data such as the emerging Internet of Things, and even live video and audio.
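To make that definition concrete, a stream record can be sketched minimally as a timestamp, a source, and an unstructured payload. This is an illustrative model only; the class and field names are hypothetical, not taken from any particular streaming platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StreamRecord:
    """One time-stamped record in a data stream (illustrative sketch)."""
    timestamp: datetime  # when the event occurred
    source: str          # e.g. a CRM system, a social feed, a sensor
    payload: dict        # unstructured or semi-structured body

# Records from different sources interleave on one timeline.
records = [
    StreamRecord(datetime(2016, 1, 4, 9, 31, tzinfo=timezone.utc),
                 "crm", {"event": "ticket_opened", "priority": "high"}),
    StreamRecord(datetime(2016, 1, 4, 9, 30, tzinfo=timezone.utc),
                 "sensor-42", {"temp_c": 21.4}),
]

# Stream processing treats time, not source, as the organizing axis.
ordered = sorted(records, key=lambda r: r.timestamp)
```

The common thread across messaging, CRM, sensor, and social data is exactly this shape: heterogeneous payloads unified by a timestamp.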
The CAO’s charge is to enable the enterprise to deal with all of this data and generate timely, actionable intelligence from it – increasingly in real time. I’ve been calling this process of business intelligence for streaming data “stream intelligence” for a while now. Among the dozens of CAOs I’ve spoken with recently, moving from classical business intelligence on static data to stream intelligence is one of their biggest priorities for 2016. This emerging form of BI creates unique problems for enterprise companies, but it also creates unique opportunities for those companies to discern and discover trends early, while there is still time to act on them.
Understanding BI 3.0
Thomas Davenport is a professor at Babson College, a research fellow at the MIT Center for Digital Business, and a senior advisor to Deloitte Analytics. He has written eloquently about these topics since 2010 and offers a framework for thinking about the past, present, and future of analytics.
For Davenport, BI 1.0 was about traditional analytics, providing descriptive reporting from relatively small internally sourced data. It was about back-room teams and internal decision reports.
BI 2.0 was about complex, much larger unstructured data sources. It was also about new computational capabilities that ran on top of traditional analytics. With big data, we saw data scientists first emerge, alongside several waves of new data-based products and services. This is where we are today.
BI 3.0 is about rapid, agile insight delivery – analytical tools at the point of decision, and decision making at scale. Today, analytics are considered a key asset enabling strategic decision making, not merely a mirror reflecting an organization’s past and present.
The “how” of accomplishing this vision amounts to balancing support for the “three V’s” of data (volume, variety, and velocity) in the enterprise analytics stack. Most big data and BI technologies to date were engineered to solve volume and variety, with very little emphasis placed on the velocity of data and analytics. This has to change.
Analysts are already drowning in the volume, variety, and velocity of data. To make matters worse, the rate at which new analysts are being trained lags far behind the growth in demand for analysts and data scientists. In fact, the gap between the supply of analyst hours and the demand for analyst cycles is growing exponentially. This means that there will never be enough data scientists to cope with the rise of unstructured stream data in the enterprise.
To solve the growing “analyst gap” we either need to figure out how to make exponentially more analysts, or we have to figure out how to make the finite supply of analysts exponentially more productive. I prefer the latter solution, but to accomplish it, analysts need automation.
Manual analysis by humans is still possible for structured data, but not for streaming data. Streaming data is just too complex, and changes too fast for human analysts to keep up with on their own. Automation is the only practical way to keep up with changing streams of big data.
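What that automation can look like in its simplest form: a detector that scores every arriving value against a rolling window of recent history and flags sharp deviations, with no human in the loop. This is a toy sketch of one common technique (a rolling z-score), not a prescription for any particular product or the approach any specific vendor takes.

```python
from collections import deque
import math

class StreamAnomalyDetector:
    """Flags values that deviate sharply from a rolling window's mean.

    A minimal illustration of automated stream analysis: each record
    is scored against recent history as it arrives.
    """

    def __init__(self, window_size=50, threshold=3.0):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold  # flag values this many std-devs from the mean

    def observe(self, value):
        """Score `value` against the window, then add it. Returns True if anomalous."""
        anomalous = False
        if len(self.window) >= 10:  # require some history before scoring
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomalous = True
        self.window.append(value)
        return anomalous

# A steady signal oscillating around 10, then a sudden spike.
detector = StreamAnomalyDetector(window_size=20, threshold=3.0)
for i in range(20):
    detector.observe(9.0 if i % 2 == 0 else 11.0)
spike_flagged = detector.observe(100.0)
```

The point is not this particular statistic but the pattern: the machine watches every record, and the analyst is paged only when something breaks from the baseline, which is how a finite supply of analysts scales to an unbounded supply of data.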
BI 3.0 emphasizes real-time business impact and makes use of automation in the analytics process. This will increasingly be achieved with a seamless blend of traditional analytics and big data. BI 3.0 analytics are now integral to running the business day-to-day and hour-to-hour.
Following the flow of big data investment
I’ll close by talking about where big data investment dollars are starting to go. In short, real-time stream data is now a major priority, and historical data is now riding in the back seat.
According to Harvard Business Review, 47 percent of expenditures are directed towards process improvement, 26 percent towards accommodating a greater variety of data, and 16 percent towards addressing a greater volume of data. Velocity of data today represents a very small slice, at three percent of overall investment, but that slice will grow quickly in 2016.
In fact, organizations that have prioritized real-time data are outpacing all others, according to Aberdeen Group. Companies that are competent across volume, variety, and velocity alike have seen 26 percent growth in their pipelines, a 15 percent increase in cash generated, and a 67 percent reduction in operational costs.