
Internet of Things Tops 2015 Healthcare Big Data Analytics Trends

2015 has been a busy year in healthcare big data analytics. What are some of the top trends we've seen over the past twelve months?

Even in a decade filled with milestone events for the healthcare industry, 2015 must certainly stand out as one of the most eventful years for the big data analytics world. 


From the raging controversy over the timing of Stage 3 meaningful use, the less-dire-than-expected ICD-10 conversion, and mounting pressure on vendors to make EHR interoperability a priority, to the advent of the Internet of Things and a system-wide push to put population health management into action, the past twelve months have been packed with progress, change, and even a few disappointing setbacks.

As the year comes to a close and the holidays give providers the chance to reflect on their challenges and choices, HealthITAnalytics.com looks back at some of the biggest trends of 2015, and how they are helping to craft the healthcare industry of the future.

What were the top five most popular issues that defined the past year, and why are they so important for continuing to build the nation’s learning health system into a data-driven powerhouse of quality patient care?

The Internet of Things hits its stride

Patient engagement has always been a high priority for healthcare reform advocates, but few technologies have been able to personalize the care experience like the new crop of wearable devices, smartphone apps, and home monitors that have been made available to consumers in the past year.

The Internet of Things (IoT), a loose conglomeration of personal tech and provider-centric data analytics tools, promises to keep patients connected to their caretakers in unprecedented ways.  IoT devices have the potential to improve patient safety, make chronic disease management simpler, and provide healthcare organizations with the detailed data they need to engage in effective population health management programs.

While the associated influx of data may seem like trouble to overwhelmed clinicians still struggling to wrestle their EHRs into submission, IoT developers are putting plenty of effort into designing interfaces that ease the burden of sifting through reams of data from sleep trackers, diet apps, heart monitors, and smartwatches.

The visionary technologies and lofty goals of the Internet of Things are just getting started, but providers with an eye on 2016 should prepare for the infrastructure requirements and big data analytics competencies they will need to meet patient expectations and succeed in a tightly woven technological world.

Precision medicine moves from theory to therapy

Precision medicine started off with a bang in January, when it took on a starring role in President Obama’s State of the Union Address, and it has only continued to rise from there.  In the middle of the year, personalized medicine found itself in the spotlight again as Congress tackled the 21st Century Cures Act, hoping to bolster the nation’s research into rare and deadly diseases.

What was once an expensive, time-consuming theory is now becoming a matter of everyday practice for researchers and clinicians as next-generation techniques for genome sequencing bring down costs and open new possibilities.

Characterized as a “Zen-like journey of becoming” by David Delaney, MD, Chief Medical Officer and leader of SAP’s US healthcare division, the process of “pushing back the walls of science” isn’t an easy one. 

Healthcare’s big data analytics competencies must keep pace with research efforts that require huge numbers of patients for large-scale cohorts that produce meaningful results.  Currently, just three percent of cancer patients contribute to clinical trials and other research programs, says the American Society of Clinical Oncology (ASCO), but precision medicine demands much broader involvement.

In 2016, a long list of publicly and privately funded initiatives is planning to address these data shortages.  If the National Institutes of Health receives the right funding, it will be able to begin work on a million-patient databank and a number of complementary programs, while ASCO’s CancerLinQ project is already bringing precision clinical decision support to oncologists across the nation.

Academic research institutions and medical centers are also investing heavily in tailored treatments, bringing a spirit of eager exploration and cross-industry collaboration to the process of addressing cancer, Alzheimer’s disease, Parkinson’s disease, rare genetic mutations, and other conditions that affect millions of patients every year.

“We need all sectors to work together,” wrote White House Chief Data Scientist DJ Patil and Stephanie Devaney, Project Manager of the Precision Medicine Initiative, in a blog post this August.

“We need people to actively engage in research and voluntarily choose to share their data with responsible researchers who are working to understand health and disease. We need healthcare providers to share their insight and help translate new findings into better care. And we need a strong, secure, and nimble infrastructure for health data that protects privacy, ensures security and facilitates new research models.”

Population health management continues to evolve

Precision medicine dissects the tiniest of individual traits to find personalized therapies, but providers have to manage patients on the other side of the size spectrum, as well. 

Population health management has been a hot topic for several years as value-based reimbursement gains mainstream momentum, and 2015 has been no exception.  Despite lingering concerns over start-up costs and spending, the accountable care organization (ACO) and patient-centered medical home (PCMH) have become bastions of the coordinated care movement.

Transparency, adaptability, health IT adoption, and big data analytics tools are the keys to success for accountable care organizations, while a commitment to excellence and continued improvement also bring savings and better outcomes to patient-centered medical homes.

As healthcare organizations sign up for risk-bearing accountable care contracts, they need a firm foundation in the basics of population health, including attention to detail, a robust patient outreach program, and the right analytics tools to effectively stratify patients and identify risks.

“Organizations that do this well have very clear, well-defined care plans,” said Wendy Vincent, Director of Advisory Services at audit and consulting firm KPMG.  “They know exactly what needs to happen, and they execute those strategies efficiently.  They also have contingency plans.  When they see a patient heading south, they have controls in place to catch that and get the person back on track.  That is very important.”

Interoperability demands reach a fever pitch

If 2013 was the “Year of EHR Replacement” and 2014 was the “Year of Meaningful Use,” 2015 could easily be called the “Year of Interoperability.”

The increasing focus on big data analytics to support ongoing financial reform spurred thousands of providers and other stakeholders to get very vocal about the crippling shortfalls of their data silos and incompatible health IT products, and vendors scrambled to respond.

After a series of high-profile Congressional hearings roasted vendors for alleged information blocking, walled gardens have started to come down and API tools have caught FHIR as providers, voting with their wallets, move away from proprietary systems and toward open, standards-based products.
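
To make the contrast with proprietary interfaces concrete, below is a minimal sketch of what an open, standards-based exchange looks like under FHIR’s REST conventions: an ordinary HTTPS search request that any conformant server can answer.  The base URL is a placeholder rather than a real EHR endpoint, the example assumes Python with the requests library, and the exact JSON media type varies slightly between FHIR versions.

```python
import requests  # third-party HTTP client: pip install requests

# Placeholder base URL for a FHIR-conformant server -- not a real endpoint.
FHIR_BASE = "https://ehr.example.org/fhir"

# A standard FHIR REST search: GET [base]/Patient?family=<name>
# Any conformant server answers with a Bundle of matching Patient resources,
# regardless of which vendor built it -- which is the interoperability point.
response = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"family": "Smith", "_count": 10},
    headers={"Accept": "application/fhir+json"},
    timeout=30,
)
response.raise_for_status()
bundle = response.json()

# Walk the searchset Bundle and print a few basic demographics.
for entry in bundle.get("entry", []):
    patient = entry["resource"]
    name = (patient.get("name") or [{}])[0]
    print(patient.get("id"), name.get("family"), patient.get("birthDate"))
```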

EHR interoperability is a core requirement for participation in Stage 3 meaningful use, which faces its own unhappy torches-and-pitchforks protests, and providers have repeatedly complained that they simply do not have the tools to make meaningful use work the way it is supposed to.

As private industry initiatives like Carequality and the CommonWell Health Alliance pick up speed and the ONC starts to work through its interoperability roadmap, the industry is primed to turn a corner on health information exchange and data fluidity as it tackles more and more big data obstacles in 2016.

Moving beyond the mandates and into big data-driven care

Whether or not Stage 3 meaningful use goes forward as planned in 2017 and 2018, federal mandates of one sort or another are here to stay.  Healthcare providers must meet the reporting and quality requirements of these programs, but they also have to find a way to keep their focus on the patient.

Big data has the potential to help healthcare organizations balance their competing initiatives and equip providers with the tools they need to leverage their EHRs for better care rather than feel overwhelmed by them.  The trick is to understand how big data can be valuable, how it intersects with mandated reporting requirements, and why it’s so important to improve the way the healthcare system measures its progress and evaluates its weaknesses.

“Providers are struggling with the scope of reporting that they have to do, particularly those who are not part of these big systems where somebody else takes care of all that for you,” said National Quality Forum (NQF) President and CEO Christine Cassel, MD.  “I think that there is a lot of work that goes into performance reporting.”

“We all know how important it is, and we all know the tremendous gains that have happened because we’ve been able to look at performance measures and reduce hospital-acquired conditions, for example, and improve quality and safety in so many areas.”

“We always have to ask ourselves how we can turn data into meaningful information, particularly for consumers and providers,” she continued.  “You do that by creating measures that are accurate and that are consistent so that you can compare apples to apples across a community, a group of providers, or across the nation as we try to achieve the Triple Aim of better population health, better healthcare, and more affordable healthcare.”

As healthcare providers continue to develop these competencies in 2016, they are likely to gain a better understanding of how they fit into the nationwide healthcare ecosystem and how they can make bigger strides towards patient safety and quality improvements while getting to grips with new payment reforms that stress value over volume.

Healthcare big data analytics, bolstered by uniform reporting requirements, will help providers “know value when we see it,” says Cassel, strengthening providers’ ability to reduce unnecessary variations, save money, and manage populations while excelling in the delivery of safe and effective patient care.
