For most of the past decade, Chief Information Officers have named “business intelligence and analytics” their top priority in Gartner Inc.’s annual surveys of technology priorities. That set of technologies moved to number one in the survey in 2006 and stayed there through 2009. It fell to fifth in 2010 and 2011, but was back on top in 2012 and has stayed there ever since.
One might think, then, that CIOs would put a major focus on using BI and analytics in IT operations. Nope. Many IT organizations have been enthusiastic about applying analytics in various forms to marketing, HR, supply chain, and other aspects of the business, but not to themselves. IT often strains to provide basic metrics about its own operational performance or to report them in anywhere near real time. Yes, most IT organizations report on their uptime, but that’s about it.
The rest of the business is increasingly focused on moving beyond descriptive analytics and reporting to predictive and prescriptive analytics – analyses that anticipate the future and recommend how employees can do their jobs better. There are a variety of applications for such advanced analytics in IT operations; Wikipedia carries a helpful list from Gartner. They include, for example, predicting when key components of the IT infrastructure might develop problems and recommending resolution strategies for system outages. There is also a clear opportunity to use analytics to predict security breaches rather than responding to them after the fact. But the fact that this list exists doesn’t mean that many IT organizations have actually put those applications in place. I see very few examples of predictive and prescriptive applications in IT operations.
There are many possible explanations for why IT has become the shoemaker’s child, going without analytical shoes. One is that IT is too busy helping others with analytics to work on itself. Another is that it doesn’t think detailed metrics and analytics would reflect well on it. A third is that CIOs and other senior executives believe the payoff is greater for non-IT analytics. A fourth is that there simply isn’t enough data to analyze. This last explanation seems unlikely, since computers are quite good at generating data about their own performance.