Agile (with a big A) refers to a philosophy of software development formalised in 2001. The Agile Manifesto was published to help development teams move away from large projects that missed company requirements, and instead focus on smaller, incremental improvements that would add up over time.
While the principles of Agile have now entered everyday language, the world of Business Intelligence is one where large legacy projects have persisted.
However, this is starting to change. Just as Agile has gone from a specific term in software development to being an IT management buzzword, there has been a lot of effort to make BI respond faster to business requirements. Today, “agile” with a small A represents all the potential that IT and digital processes can provide in making things run faster. The growth of visual discovery tools has exposed data analysis capabilities to a larger audience, thus expanding the number of “information producers” inside an organisation, while Cloud BI platforms now represent a viable alternative to the legacy BI platforms that previously ruled the roost.
These cloud-architected platforms enable companies to be more agile by dramatically accelerating the rate of progress and shortening the time required to deliver value to the business. By eliminating many of the tasks associated with traditional on-premises deployments, the cloud gives organisations the ability to experiment with much less risk – “fail fast” is the popular term – and respond to business changes more quickly.
Cloud BI solutions with true web-scale, multi-tenant architectures are also pioneering the concept of “virtual BI spaces”, which introduce fascinating possibilities in terms of how organisations network different sources of data together. This provides decentralised line of business teams and individuals with greater autonomy, without creating analytical silos that lead to information chaos.
The challenge is for BI teams to change their management mindset around data. This means taking a more agile approach to working with data, and to the line of business teams that are asking for more of it. Rather than acting as gatekeepers for data and being responsible for creating reports, IT can instead help those line of business teams access data and build their own analytics, dashboards and reports.
Iterating on analytics and data
To make the process easier, there are three steps that can help:
- Distinguish global versus local data requirements: Some data and definitions will be common across the enterprise – financial metrics necessary for quarterly results or compensation, for example. These need to be defined and managed centrally for consistency. Other data and definitions will be specific to the local team. Responsibility for this data can therefore be held locally in the format that best fits the needs of the team.
- Empower teams to serve their own needs: End users want to make more use of data, but this can require some work beforehand to prepare it for analysis, which takes time for the central team. Helping individuals or local teams to combine and prepare data on their own, without the need for centralised data modelling, can speed up the process and avoid bottlenecks. This self-service approach to data can also greatly increase agility by removing the data preparation burden from IT.
- Networking BI together: Bringing local and global data together can help teams see and understand business performance faster than relying on static reports. By networking data and BI together, local teams can execute their analysis while still getting global governance support. The use of “virtual spaces” gives decentralised teams and individuals the ability to work in sandboxes that are networked with each other, thus avoiding data silos. As a result, business users can collectively create a common semantic layer, built to ensure global governance.
After this three-step process is completed, it’s important to assess how the new analytics approach is performing for the team using it. Are they seeing improvements in the quality of their decisions now that they have more accurate data to draw on? Are they able to make decisions faster than before? The answers can then feed back into the framework to improve things further.