
Hospitals and medical centers have more to gain from big data analytics than perhaps any other industry. But as data sets continue to grow, healthcare facilities are discovering that success in data analytics has more to do with storage methods than with analysis software or techniques. Traditional data silos are hindering the progress of big data in the healthcare industry, and as terabytes turn into petabytes, the most successful hospitals are the ones that are coming up with new solutions for storage and access challenges.



Reducing Reliance on Data Silos is Crucial for True Data Integration


Big data in any healthcare application is only as efficient as the technical foundation on which it is built. The healthcare industry is learning that data silos get in the way of analytical efforts. The ability to integrate independent, far-flung information sets is the backbone of successful analytics.


When operational, clinical, and financial data, for example, are segregated in isolated silos, the information in each category remains segmented and dislocated. When these data are integrated, however, useful analysis can be conducted. Only then can analysts discover opportunities for cost reduction, locate gaps in patient care, and find out where resources could be better utilized.
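To make the payoff concrete, here is a minimal Python sketch of the kind of cross-domain question that becomes answerable once clinical and financial records share a common patient key. The record fields, patient IDs, and figures are hypothetical.

```python
# Hypothetical clinical and financial records, keyed by patient ID.
clinical = {
    "P001": {"readmitted": True, "length_of_stay": 6},
    "P002": {"readmitted": False, "length_of_stay": 2},
}
financial = {
    "P001": {"total_cost": 18400},
    "P002": {"total_cost": 5100},
}

# Integration step: one merged record per patient present in both sets.
integrated = {
    pid: {**clinical[pid], **financial[pid]}
    for pid in clinical.keys() & financial.keys()
}

# A question that neither silo can answer alone:
# what does a readmitted patient cost, on average?
readmit_costs = [r["total_cost"] for r in integrated.values() if r["readmitted"]]
avg_readmit_cost = sum(readmit_costs) / len(readmit_costs)
```

In a real deployment the join key, record linkage, and de-identification rules are the hard part; the point here is only that the query is impossible until the silos share a key.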


To get a feel for just how scattered the pieces of the data puzzle can become, consider that some healthcare operations are weighing the incorporation of patient consumer data, such as credit-card purchases, to gain a more complete picture of a patient's lifestyle and choices. When these data are compartmentalized in silos, that integration is impossible. The organizations that get the best results are the ones that find a way to get their information out of silos and merge the different sets into one fluid, cohesive network.


Data Triage and Tiering: Beyond Cloud Storage


But if organizations shouldn't store data in silos, what are their other options?


The obvious solution is cloud storage, which is an excellent choice for operations that measure data in terabytes, such as a marketing agency that monitors metrics for Gospaces landing pages. Smaller organizations or facilities that are just entering the data-storage ecosystem are naturally drawn to cloud storage, and for good reason — it gets their data out of silos and takes pressure off their own infrastructure.


But consider the case of Intermountain, a chain of 22 hospitals in Salt Lake City. With 4.7 petabytes of data under management, cloud storage becomes cost-prohibitive. The network estimates its data will grow by 25-30 percent each year, reaching roughly 15 petabytes within five years.
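The ~15-petabyte projection is consistent with simple compound growth from the 4.7-petabyte starting point, which a quick check confirms:

```python
# Compound-growth check on the figures quoted above:
# 4.7 PB growing 25-30 percent per year for five years.
start_pb = 4.7
low_estimate = start_pb * 1.25 ** 5   # about 14.3 PB at 25% annual growth
high_estimate = start_pb * 1.30 ** 5  # about 17.5 PB at 30% annual growth
```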


With such massive data needs, Intermountain found ways to cut costs and improve efficiency. One was data tiering: creating storage tiers that can be accessed at speeds appropriate to how often the data is needed. Tiering is currently done manually through triaging, but several organizations are exploring auto-tiering, which places data automatically according to its availability needs.
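The auto-tiering idea can be sketched as a simple placement policy that assigns each data set to a storage tier based on recent access frequency. The tier names, thresholds, and data-set examples below are hypothetical, not Intermountain's actual scheme.

```python
# Minimal auto-tiering sketch: map recent access frequency to a tier.
# Thresholds are illustrative assumptions, not production values.
def choose_tier(accesses_per_month: int) -> str:
    if accesses_per_month >= 100:
        return "hot"    # fast, expensive storage for active data
    if accesses_per_month >= 10:
        return "warm"   # mid-speed storage for occasional access
    return "cold"       # slow, cheap archival storage

# Hypothetical data sets with their monthly access counts.
datasets = {
    "active_patient_records": 450,
    "last_quarter_billing": 35,
    "imaging_archive_2010": 2,
}
placement = {name: choose_tier(n) for name, n in datasets.items()}
```

A production system would also track migration costs and re-evaluate placements periodically, but the core of auto-tiering is exactly this kind of access-driven policy replacing manual triage.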



From wearables to records sharing, big data is having a greater impact on healthcare than on perhaps any other field. But as data sets keep growing, hospitals and medical centers face storage and access barriers that have so far proven difficult, though not impossible, to overcome. Large health networks are using techniques like data triage and data tiering to manage growing volume while reducing costs.

Originally posted on Data Science Central
