Business intelligence and analytics continue to challenge organisations of all sizes.
There is a huge disparity between the data collection capacities of today’s businesses and their analytics capabilities.
In an age when information is produced in massive amounts, some companies are drowning in spreadsheets and lack data automation altogether. Meanwhile, others have invested millions of pounds in every analytics tool available.
Despite continued investment in the area, companies have been challenged to find the right way to approach business intelligence (BI) and analytics.
In the past, BI began with huge, expensive installations that gathered data in one place using data warehouses.
This proved useful for creating a single source of truth, but it left firms slow to react to changing business requirements.
The alternative was individuals using their own productivity tools to work with report data and then visualise it.
Over the years, the BI pendulum has swung from a centrally managed, IT-governed approach to one where end users analyse their own siloed sources of data.
Getting to that middle ground – where data is governed and trusted, and individuals can work with it in their own ways – should be the goal.
As they embark on this brave new world of BI and analytics, organisations crave consistency.
However, many lack the right technology and processes to achieve it. This muddled approach to deploying analytics means many firms fail when implementing large-scale projects.
Part of the problem centres around being able to view data in a useful way.
It is all too common for people from different departments to turn up at meetings with multiple answers to the same question.
This leaves others within the business unable to figure out the truth. At best, one trusted version of the data becomes gospel. At worst, business leaders and CEOs have to make decisions that affect the future of their businesses based on conflicting data.
Adding to complexity, there has been an influx of data discovery tools deployed independently from IT.
Once these tools get a foothold in one department, firms then try to implement a centrally run approach to data discovery that enables them to see what’s actually going on in the whole business.
This is made more difficult by each department using large applications such as Salesforce, which can create silos within the organisation.
It means departments don’t talk to each other and, again, it leaves leadership teams trying to comprehend multiple figures covering the same thing.
The optimum approach is one that blends the centralised and decentralised.
As part of this, the right technology can help. After all, analytics is a means to an end: people invest in BI and analytics to have access to information that helps them make better decisions.
Networking these sources of data can help in the process.
By virtualising access to information and making it easier to combine data sources, multiple departments can take part in data analysis around the same sets of central information.
If a member of a department needs to add their own local data, this can be done within their virtual space rather than relying on central IT to add or move physical data.
The central IT team can own and manage the data, and everyone can get access to it, too.
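A minimal sketch of how this virtual-space idea could look in practice, assuming a simple key/value model of the data; all names here (the datasets, the `VirtualView` class) are illustrative assumptions, not a real product's API. The governed central dataset stays where it is, and each department layers its own local data on top without copying or moving anything.

```python
class VirtualView:
    """Read-through view: a department's local data layered over central data."""

    def __init__(self, central, local=None):
        self.central = central      # governed, shared source (treated as read-only)
        self.local = local or {}    # department-specific additions

    def get(self, key):
        # Local data takes precedence, but central data is never copied or moved.
        if key in self.local:
            return self.local[key]
        return self.central.get(key)

    def keys(self):
        # The combined, "virtualised" set of available fields.
        return set(self.central) | set(self.local)


# One governed dataset owned by central IT...
central = {"q1_revenue": 1_000_000, "q2_revenue": 1_100_000}

# ...and two departmental views over it, one with its own local additions.
marketing = VirtualView(central, local={"q1_campaign_spend": 50_000})
finance = VirtualView(central)
```

Both departments read the same governed figures, so `marketing.get("q1_revenue")` and `finance.get("q1_revenue")` always agree, while marketing's local spend data stays in its own virtual space.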
Of course, internal politics also play a part in failed BI projects.
Overcoming this requires data silos between departments to be broken down. At the same time, it’s important to get the balance right, so each department is still in control of its own performance.
Therefore, companies need to define the right metrics and track them consistently over time.