It’s called Big Data for a reason: mostly because there’s so much of it that few know how to tame its yottabyte unruliness. Yet those who understand the collective value of the gems within also understand what mining them could mean – especially when it comes to healthcare. But getting there in this arena means hopping on a different kind of train than the one other industries are riding – one that requires passengers to couple their cars, rub elbows, and share.
“We’ve done a great job in the past ten years to get where we are, but I am really excited about the next decade to advance this notion to get data beyond meaningful use and advancing interoperability,” says Karen DeSalvo, MD, National Coordinator for Health IT. Now that MU Stage 3 is upon us, DeSalvo says it’s time to take advantage of all the Big Data sources living outside of the EHR and integrate the data churning there into the clinical world, “setting the stage to dramatically change the way we think about standard data capture.”
And there are those in healthcare who are doing it – like Big Pharma, which is rolling up its collaborative sleeves, as evidenced by the recent announcement of three major Big Data-sharing initiatives that have left researchers slack-jawed at the new possibilities before them.
Project Data Sphere is getting the most attention, combining the efforts of some big names to share, integrate, and analyze collective historical cancer research data in a single location via a technology platform built by the global analytics giant SAS. With such powerhouses as AstraZeneca, Bayer, Celgene, Janssen Research and Development, Pfizer, and Memorial Sloan Kettering Cancer Center committed to working together, the number of available data sets will quickly grow from the current nine to some eye-popping figures.
The second is the legislative model recently passed by the European Parliament, which is expected to take effect in 2016. Under it, clinical data generated by new trials started after the law is implemented will be available for sharing.
And the third is an exciting partnership between Big Pharma and social media, in which Genentech and PatientsLikeMe will work together, using open-source research to mine patient-generated data about real-life experiences with diseases and drugs to improve cancer research efforts. According to Jamie Heywood, founder and chairman of PatientsLikeMe,
“With Genentech, we can now embark on a journey to bring together many stakeholders across healthcare and collaborate with patients in a new way.”
If anyone’s getting the healthcare Big Data thing right, it appears to be Geisinger Health System, where CEO Glenn Steele Jr., MD, and other forward-thinkers are actually finding a way to extract and use the data within to create big change. Through Geisinger’s analytics, they’re finding that the “you-get-what-you-pay-for” equation doesn’t apply when it comes to patient care.
“We think there’s a definite inverse correlation between cost and quality. High cost really does correlate with low quality.”
That may not be good news to everyone associated with the healthcare industry, but it sure sounds like something that patients can benefit from – which is what this effort is supposed to be about in the first place. Steele says that increased complexity must lead to an increasingly simple mission that focuses on activation and empowerment of patients, enhancing quality and safety, and embracing a spirit of value reengineering.
That spirit of value reengineering takes us back to healthcare’s digital roots, attaching Big Data’s value to the reason for its use: to improve quality, increase access, and decrease healthcare costs.
“Attack cost or increase quality? If you reengineer, you may get a twofer,” said Steele.
By integrating the data-mining capability on the insurance side with the information on the provider side, Geisinger has been able to demonstrate improved care outcomes for a variety of chronic diseases.
A recent report by HIMSS Analytics and the International Institute for Analytics (IIA) confirmed what’s already apparent: some healthcare providers are more analytically mature than others. Those who haven’t yet followed Geisinger’s example shouldn’t fret, however; they should be willing to learn from those who are leading the way. According to James E. Gaston, senior director for clinical and business intelligence at HIMSS Analytics,
“Hospitals are collecting more data – what they are doing with that data is another thing.”
A significant finding in the study was that providers with the highest analytics maturity place high importance on the use of data throughout the organization. Gaston said that point was key:
“We expect it to apply across the entire organization – not just to the clinical side, and not just to the business side. Everybody needs to embrace analytics for it to be part of the culture and mature.”
There are many “turn-key” solutions touted in the technology world, but for the extraction and optimization of Big Data in healthcare, such an animal just doesn’t exist – which is why we must continually hop back on that interoperability engine.
One of the most important things we’ve learned throughout the evolution of health IT is that accessibility to essential patient information across a variety of providers and platforms increases continuity of care – which leads to better outcomes.
Within organizations and among competitors, we’ll only benefit ourselves, each other, and the patients we serve by moving beyond proprietary sentiments and climbing on the collaborative train that can get the job done.