Do Your Big Data Analytics Measure Or Predict?


Organizations have key business processes that they are constantly trying to re-engineer. These key business processes – loan approvals, college applications, mortgage underwriting, product and component testing, credit applications, medical reviews, employee hiring, environmental testing, requests for proposals, contract bidding, etc. – go through multiple steps, usually involving multiple people with different skill sets, with a business outcome at the end (accept/reject, bid/no bid, pass/fail, retest, reapply, etc.).

And while these processes typically include “analytics” that report on how well the processes worked (process effectiveness), those analytics only provide an “after the fact” view of what happened.

Instead of using analytics to measure how well the process worked, how about using predictive and prescriptive analytics to actually direct the process at the beginning?  Instead of analytics that tell you what happened, how about creating analytics at the front of the process that predict what steps in the process are necessary and in what order?  Sometimes the most effective process is the process that you don’t need to execute, or only have to execute in part.

High-Tech Manufacturing Testing Example

We had an engagement with a high-tech manufacturer to help them leverage analytics and the Internet of Things to optimize their 22-step product and component testing process.  Not only was a significant amount of capital tied up in their in-process inventory, but the lengthy testing processes also created concerns about excess and obsolete inventory in an industry where product changes happened constantly.


The manufacturer had a wealth of data coming off its testing processes, but that data was only being used after the fact to report where testing had succeeded or failed.  Instead, our approach was to use that data to predict which tests needed to be run for which components coming from which of the suppliers’ manufacturing facilities.  Rather than measuring what happened and identifying waste and inefficiencies after the fact, the manufacturer wanted to predict the likely quality of each component (given the extensive amount of data that could be captured but was instead hitting the manufacturing and testing floors) and identify which tests were needed for that particular situation.  Think dynamic, or even smart, testing.

We worked with the client to identify all the data coming out of the different testing processes.  We discovered that nearly 90% of the potential data was just “hitting the floor” because the organization had no method for capturing and subsequently analyzing it (most of it was either very detailed log files, or comments and notes generated by the testers, engineers and technicians during the testing processes).  At a conceptual level, their processing looked like Figure 1: traditional dashboards and reports answering basic operational questions about how the process was working.


However, we wanted to transform the role of analytics beyond just reporting on how the processes were working: we wanted to employ predictive analytics to create a score for each component, and then use prescriptive analytics to recommend which tests had to be run, and in what order, given that predictive score.
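At a high level, the predict-then-prescribe pattern can be sketched as follows. This is a minimal illustration, not the client's actual model: the feature names, weights, test names, and thresholds are all hypothetical, and a real deployment would replace the toy scoring function with a model trained on historical testing data.

```python
# Hypothetical sketch: score a component's failure risk (predictive),
# then prescribe only the tests warranted by that risk (prescriptive).
# All names, weights, and thresholds below are illustrative.

def predict_failure_risk(component: dict) -> float:
    """Toy predictive score in [0, 1]; a real model would be trained
    on historical test, supplier, and log-file data."""
    weights = {"supplier_defect_rate": 0.6,
               "facility_rework_rate": 0.3,
               "design_revision_age": 0.1}
    return min(1.0, sum(w * component.get(k, 0.0) for k, w in weights.items()))

# Each test carries the minimum risk score at which it is worth running,
# so low-risk components skip the expensive steps entirely.
TEST_PLAN = [
    ("visual_inspection", 0.0),   # always run
    ("electrical_check",  0.2),
    ("thermal_cycling",   0.5),
    ("burn_in",           0.8),   # reserved for the riskiest components
]

def prescribe_tests(component: dict) -> list:
    risk = predict_failure_risk(component)
    return [name for name, threshold in TEST_PLAN if risk >= threshold]

low_risk = {"supplier_defect_rate": 0.1, "facility_rework_rate": 0.1}
high_risk = {"supplier_defect_rate": 0.9, "facility_rework_rate": 0.8,
             "design_revision_age": 1.0}
print(prescribe_tests(low_risk))   # ['visual_inspection']
print(prescribe_tests(high_risk))  # all four tests
```

The payoff is exactly the point made above: the most effective process can be the one you only execute in part, because the prediction at the front of the process decides which steps are necessary.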

We used a technique called the “By Analysis” to brainstorm the “variables and metrics that might be better predictors of testing performance”.  For example, when examining components with a high failure rate, we started the brainstorming process with the following question:
“Show me the percentage of component failures by…”

We asked the workshop participants to brainstorm the “by” variables and metrics that we might want to test.
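A brainstormed "by" variable is easy to test once it is named: compute the failure percentage broken out by that variable and see whether it discriminates. The sketch below uses a tiny made-up dataset and hypothetical variable names ("supplier", "facility") purely to illustrate the mechanics of the "By Analysis".

```python
# Illustrative "By Analysis": percentage of component failures
# grouped by a candidate "by" variable. Records are made up.
from collections import defaultdict

records = [
    {"supplier": "A", "facility": "X", "failed": True},
    {"supplier": "A", "facility": "X", "failed": False},
    {"supplier": "A", "facility": "Y", "failed": False},
    {"supplier": "B", "facility": "X", "failed": True},
    {"supplier": "B", "facility": "Y", "failed": True},
    {"supplier": "B", "facility": "Y", "failed": False},
]

def failure_rate_by(records, by):
    """Percentage of failures for each value of one 'by' variable."""
    totals, failures = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[by]] += 1
        failures[r[by]] += r["failed"]
    return {k: round(100.0 * failures[k] / totals[k], 1) for k in totals}

# Try each brainstormed "by" variable; keep the ones whose groups differ.
print(failure_rate_by(records, "supplier"))  # {'A': 33.3, 'B': 66.7}
print(failure_rate_by(records, "facility"))  # {'X': 66.7, 'Y': 33.3}
```

Variables whose groups show materially different failure rates are candidates to feed into the predictive score; variables where every group looks the same can be dropped.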


