Access to long-term archived information about past Ebola outbreaks would have enhanced the world’s ability to track and mitigate the effects of the disease.
Christian Heiter is chief technology officer at Hitachi Data Systems Federal.
The current outbreak of the Ebola virus in West Africa and across the world is proving to be one of the largest and most frightening health challenges of our time.
The outbreak is the largest to date, with more than 10,000 cases across Sierra Leone, Liberia, Senegal, Nigeria, Spain and even the United States. To track and predict the movements of this disease, public health specialists and epidemiologists are turning to remarkable uses of technology to process, store and analyze the enormous amount of data being produced.
Using maps, algorithms, mobile applications, websites, sensors and electronic health records, authorities are assembling and analyzing pools of data from across the world to understand the patterns and predict where the disease will strike next.
While nations and agencies alike struggle to control and mitigate the outbreak, there have been small measures of success to report.
A recently deployed mapping application released by the National Geospatial-Intelligence Agency gives relief agencies, disease specialists and governments access to information on Ebola outbreaks, medical treatment centers, electrical grids, water supplies and other infrastructure that must be in place to contain the spread of the virus.
On the ground, the Centers for Disease Control and Prevention is using the Epi Info Viral Hemorrhagic Fever application, an open source program that identifies those exposed to the virus and builds a large database of patient information, including names, gender, age, location, medical history and many other identifiers.
In a remarkable application of big data analytics, the Swedish firm Flowminder overlaid past outbreaks of the virus on population mobility patterns derived from 2013 cell phone records in Senegal to predict the growth and movement of Ebola within the country.
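The core idea behind this kind of analysis can be illustrated with a minimal sketch: aggregated travel volumes between districts, derived from anonymized call records, are weighted by reported case counts to rank districts by their risk of imported cases. The district names, travel volumes and case counts below are illustrative assumptions, not Flowminder's actual data or method.

```python
# Hypothetical sketch of mobility-based outbreak risk ranking.
# trips[origin][destination]: estimated daily travelers between districts,
# aggregated from anonymized call records (illustrative numbers only).
trips = {
    "Dakar": {"Thies": 1200, "Kolda": 150},
    "Thies": {"Dakar": 1100, "Kolda": 80},
    "Kolda": {"Dakar": 200, "Thies": 90},
}

# Reported active cases per district (illustrative).
cases = {"Dakar": 0, "Thies": 0, "Kolda": 12}

def importation_risk(trips, cases):
    """Score each district by case-weighted inbound travel volume."""
    risk = {district: 0.0 for district in trips}
    for origin, destinations in trips.items():
        for destination, volume in destinations.items():
            # Travelers leaving an affected origin raise the destination's risk.
            risk[destination] += volume * cases.get(origin, 0)
    return sorted(risk.items(), key=lambda kv: kv[1], reverse=True)

for district, score in importation_risk(trips, cases):
    print(f"{district}: {score:.0f}")
```

With these toy numbers, the districts receiving the most travelers from the affected district rank highest, which is the signal decision-makers would use to position treatment and screening resources.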
Data Key to Health Care Innovation
These innovations are all being deployed in the context of a rapidly changing digital health care environment in the United States, in which providers are being steered toward electronic health records and big data analytics.
While much of the current Ebola response is rooted in physical infrastructure and operations, it is clear the response is augmented by the ability to leverage data.
Our most valuable asset in a health care context is data.
The evolution of epidemiology will depend on successfully leveraging existing and new medical data, often in disparate formats. When the next health crisis emerges, agencies such as the CDC, the Federal Emergency Management Agency, the Department of Health and Human Services and other health care agencies must be able not only to access data in real time as it is produced, but also to tap existing historical records to keep the population safe and respond to potential outbreaks.
Government Needs New Long-Term Data Storage Strategy
But this dress rehearsal for how our government can respond to an epidemic using the power of information comes at a time when federal agencies are already struggling to cope with storing and managing exabytes of data.
In fact, IDC now forecasts that the world will generate 40 zettabytes of data by 2020. Furthermore, agencies are being asked to store their critical data for longer periods, in some cases indefinitely, which only compounds the issue.
The bottom line: to manage this data efficiently and cost-effectively, the government needs a new long-term data storage strategy.
There is no question that access to long-term archived information about past Ebola outbreaks and other historical epidemics would have significantly enhanced the world's ability to track and mitigate the effects of the disease.
Is Optical Data Storage the Answer?
To ensure data like that being used to quell the spread of Ebola is available for insight and analysis for the next 1,000 years, agencies should invest in optical storage platforms.
One key differentiator is that optical storage media are reliable, durable, energy efficient and inexpensive. Perhaps most important, optical discs retain data in its original form with minimal degradation. Storing mission data on optical media mitigates the risk of data loss, whether through migration or degradation, and significantly drives down the cost of archiving.
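The migration argument can be made concrete with back-of-the-envelope arithmetic: an archive on spinning disk must be re-purchased and migrated every few years over its retention period, while write-once media are bought once. The prices, lifetimes and retention period below are illustrative assumptions, not vendor figures.

```python
# Illustrative long-term archiving cost comparison (assumed numbers only):
# disk archives are re-bought and migrated on a refresh cycle; write-once
# optical media are purchased a single time for the whole retention period.

def archive_media_cost(price_per_tb, retention_years, refresh_every):
    """Total media spend per terabyte over the retention period,
    re-purchasing the archive every `refresh_every` years."""
    purchases = -(-retention_years // refresh_every)  # ceiling division
    return price_per_tb * purchases

RETENTION = 50  # years of mandated retention (assumed)

# Assumed: $30/TB disk refreshed every 5 years vs. $100/TB optical written once.
disk_cost = archive_media_cost(30, RETENTION, refresh_every=5)
optical_cost = archive_media_cost(100, RETENTION, refresh_every=RETENTION)

print(f"disk:    ${disk_cost}/TB over {RETENTION} years")
print(f"optical: ${optical_cost}/TB over {RETENTION} years")
```

Under these assumptions the repeated refresh cycles, not the per-terabyte sticker price, dominate the long-term cost of the disk archive, and each migration is also an opportunity for data loss.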
As we see more epidemics and threats occur around the world, agencies should have an optical data storage strategy in mind. As was demonstrated by Flowminder, data collected from the current outbreak can be used to assist decision-makers now, but also in future crises.
What we learn from this treasure trove of data will help virologists and epidemiologists make incredible advancements in how we understand diseases for the foreseeable future, but that information must first be preserved safely.