The health IT landscape often seems starkly divided between two opposing camps: those who design elegant algorithms and regimented workflows in the idealized sterility of the R&D department, and those whose frenzied, frustrating daily experience in the hospital, office, or clinic has taught them that there is no such thing as an out-of-the-box solution to any of their big data needs.
This gulf between theoretical data science and the harsh realities of clinical practice is at the root of the healthcare system’s ongoing health IT usability problems, which have only compounded in recent years as new regulatory programs and the demands of value-based care ramp up providers’ critical need for actionable insights and intuitive documentation tools at the point of care.
As pay-for-performance reimbursement structures pressure providers to engage in shrewder decision-making about the costs and necessities of common procedures, healthcare organizations have started to scoop up big data analytics tools that promise meaningful clinical decision support (CDS), predictive analytics, and population health management capabilities.
But many of these tools have quickly fallen short of those promises, argues a team of data scientists and researchers from MIT, leaving providers feeling more than a little cheated by the health IT revolution.
In order to truly extract value from healthcare analytics, the industry must use emerging strategies, including research incentives and better education, to overcome the chasm between theory and practice.
“Medicine has clumsily entered its digital age via the back door,” the team says in an article published last month in the Journal of Medical Internet Research. “There is a persistent gap between the clinicians required to understand the clinical relevance of the data and the data scientists who are critical to extracting useable information from the increasing amount of health care data that are being generated.”
“Practitioners continue to make determinations in a technically unsupported and unmonitored manner due to a lack of high-quality evidence or tools to support most day-to-day decisions, and as a result the rate of diagnostic errors by individual practitioners is unacceptably high.”
Since the beginning of the EHR Incentive Programs, federal rule makers have made it a top priority to reduce those variations in care by implementing a layer of data-driven technologies to help discover, address, and eliminate quality and performance issues.
Opinions vary as to whether electronic health records and digital documentation requirements have helped or hindered this goal, but there is little question that a secondary layer of complementary technologies, such as CDS systems, integrated risk scoring tools, and patient management systems, is often required to supplement the basic capabilities of the EHR.
Increasingly, these systems are driven by machine learning algorithms that use previous data inputs and past results to refine future recommendations, allowing clinicians to leverage mistakes and successes to improve the accuracy and delivery of patient care.
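As a toy illustration of that feedback loop (not drawn from the article itself), the sketch below shows a minimal online logistic model that nudges its risk estimates each time a new patient outcome is observed, so past mistakes and successes gradually refine future recommendations. All feature names, data, and thresholds here are hypothetical.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class OnlineRiskModel:
    """Toy online logistic model: each observed outcome nudges the
    weights, so future risk estimates reflect past results."""

    def __init__(self, n_features, lr=0.1):
        self.weights = [0.0] * n_features
        self.bias = 0.0
        self.lr = lr

    def predict_risk(self, features):
        # Current estimate of the probability of an adverse outcome.
        z = self.bias + sum(w * x for w, x in zip(self.weights, features))
        return sigmoid(z)

    def update(self, features, outcome):
        # One gradient step on log-loss: shift the weights toward
        # the outcome that was actually observed (0 or 1).
        error = self.predict_risk(features) - outcome
        self.weights = [w - self.lr * error * x
                        for w, x in zip(self.weights, features)]
        self.bias -= self.lr * error

# Hypothetical inputs: [age / 100, prior admissions / 10]
model = OnlineRiskModel(n_features=2)
history = [([0.70, 0.3], 1), ([0.30, 0.0], 0),
           ([0.80, 0.5], 1), ([0.25, 0.1], 0)]
for _ in range(200):          # replaying past cases refines the estimates
    for features, outcome in history:
        model.update(features, outcome)

print(round(model.predict_risk([0.75, 0.4]), 2))  # higher-risk profile
print(round(model.predict_risk([0.28, 0.0]), 2))  # lower-risk profile
```

Production CDS tools are, of course, far more sophisticated, but the core loop is the same: predictions are scored against real outcomes, and the model is adjusted accordingly.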
These strategies have started to find favor with designers and clinicians exploring complex and multi-faceted use cases in patient care, including chronic disease management, imaging analytics, hospital readmissions, and sepsis. But many providers are either hesitant to accept yet another step in their convoluted workflows, or do not believe that the investment required in such cutting-edge tools will produce a sufficient return.
Creating and implementing machine learning tools that bring value to clinicians and patients without overwhelming the workflow will require industry-wide collaboration, a broader understanding of how to apply data science to clinical practice, and enhanced opportunities to build knowledge and engage in research about the role and adoption of big data analytics, the researchers assert.