Artificial intelligence and machine learning are predicted to be part of the next industrial revolution, and could help business and industry save billions of dollars over the next decade.
The tech giants Google, Facebook, Apple, IBM and others are applying artificial intelligence to all sorts of data.
Machine learning methods are being used in areas such as translating language almost in real time, and even identifying images of cats on the internet.
So why haven’t we seen artificial intelligence used to the same extent in healthcare?
Radiologists still rely on visual inspection of magnetic resonance imaging (MRI) or X-ray scans – although IBM and others are working on this issue – and doctors have no access to AI tools to guide and support their diagnoses.
Machine learning technologies have been around for decades, and a relatively recent technique called deep learning keeps pushing the limit of what machines can do. Deep learning networks organise neuron-like units into hierarchical layers, which can recognise patterns in data.
This is done by iteratively presenting data along with the correct answers to the network until its internal parameters, the weights linking the artificial neurons, are optimised. If the training data capture the variability of the real world, the network is able to generalise well and provide the correct answer when presented with unseen data.
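The training loop described above can be sketched in miniature. The toy example below (an illustration, not any specific medical system) trains a single neuron-like unit by gradient descent: the data and the correct answers are presented repeatedly, and the weights are nudged each time to reduce the prediction error, until the unit has learned the logical AND function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four training cases with their correct answers (the AND function).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])

w = rng.normal(size=2)   # the weights linking the artificial neurons
b = 0.0                  # bias term
lr = 0.5                 # learning rate: how far each update moves the weights

def sigmoid(z):
    # Squashes any number into the range (0, 1), read as a probability.
    return 1.0 / (1.0 + np.exp(-z))

# Iteratively present the data with the correct answers and adjust
# the internal parameters to shrink the prediction error.
for _ in range(5000):
    pred = sigmoid(X @ w + b)       # network's current answers
    err = pred - y                  # how wrong each answer is
    w -= lr * (X.T @ err) / len(y)  # gradient step on the weights
    b -= lr * err.mean()            # gradient step on the bias

print(np.round(sigmoid(X @ w + b)))  # rounded predictions match y
```

Real deep networks stack many such layers and optimise millions of weights the same way, which is why training them demands the very large labelled data sets the article goes on to discuss.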
So the learning stage requires very large data sets of cases along with the corresponding answers. Millions of records and billions of computations are needed to update the network parameters, often on a supercomputer running for days or weeks.
Herein lie the problems with healthcare: data sets are not yet big enough, and the correct answers to be learned are often ambiguous or even unknown.
The functions of the human body, its anatomy and its variability are very complex. The complexity is even greater because diseases are often triggered or modulated by genetic background, which is unique to each individual and hence hard to train on.
Adding to this, medical data present specific challenges. These include the difficulty of measuring any biological process precisely and accurately, which introduces unwanted variation.
Other challenges include the presence of multiple diseases (co-morbidity) in a patient, which can often confound predictions. Lifestyle and environmental factors also play important roles but are seldom available.
The result is that medical data sets need to be extremely large.
This is being addressed across the world with increasingly large research initiatives. Examples include Biobank in the United Kingdom, which aims to scan 100,000 participants.
Others include the Alzheimer’s Disease Neuroimaging Initiative (ADNI) in the United States and the Australian Imaging, Biomarkers and Lifestyle Study of Ageing (AIBL), tracking more than a thousand subjects over a decade.
Government initiatives are also emerging, such as the American Cancer Moonshot program. The aim is to “build a national cancer data ecosystem” so researchers, clinicians and patients can contribute data with the aim to “facilitate efficient data analysis”. Similarly, the Australian Genomics Health Alliance aims to pool and share genomic information.
Eventually, the electronic medical record systems being deployed across the world should provide extensive, high-quality data sets. Beyond the expected gains in efficiency, the potential to mine population-wide clinical data using machine learning is tremendous.