Data volumes are exploding. More data has been created in the past two years than in the entire previous history of the human race.
By the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet.
By then, our accumulated digital universe of data will grow from 4.4 zettabytes today to around 44 zettabytes, or 44 trillion gigabytes – a ten-fold increase in just four years.
Big data is also helping to make the world a better place, and there’s no better example than the uses being found for it in healthcare.
With the world’s population increasing and everyone living longer, models of treatment delivery are rapidly changing, and many of the decisions behind those changes are being driven by data.
The drive now is to understand as much about a patient as possible, as early in their life as possible – ideally picking up warning signs of serious illness early enough that treatment is far simpler (and less expensive) than it would be if the illness were not spotted until later.
While big data is positively driving advances in healthcare, its storage and management are causing significant issues for IT managers – both because of the need to store, archive and preserve large volumes of data for future research, and because of the security and compliance challenges associated with it.
So what’s the best way for healthcare organisations to go about storing and archiving these huge volumes of big data safely, securely and cost-effectively?
Pathology is an example of an area in the NHS that is undergoing disruptive change and where new digital processes are introducing challenges to old ways of working.