- When it comes to developing a big data analytics program, healthcare organizations often can’t help but focus more on the results they hope to achieve than the long, arduous, complicated process of climbing the ladder to get there.
Fewer readmissions, more proactive population health management, improved care coordination, and a measurable increase in revenue are all exciting and admirable goals, and they certainly can be attained by using health IT to extract actionable insights that drive widespread change.
But crafting the underlying infrastructure to support operational and clinical improvements can be a difficult – and perhaps a little underappreciated – first step for success. Setting up a data warehouse, hammering home the importance of data integrity, and fine-tuning the reporting process might not be the most sensational pieces of the big data puzzle, but they are among the most critical.
At Nebraska Medicine, an academic medical center and integrated delivery network that specializes in complex tertiary care, Brian Lancaster is overseeing the careful planning and execution required to build a big data analytics program that will serve the network in good stead for many years to come.
As Executive Director of Information Management, one of Lancaster’s main tasks is architecting a system that reduces unnecessary data siloes, delivers reports in a timely and intuitive manner, and prepares the health system for the inevitable surge of regulatory and financial challenges on the way.
Success starts with the basics, Lancaster told HealthITAnalytics.com, and that includes a major focus on strategic planning, careful technology choices, and an intelligent approach to the fundamentals of big data governance.
“Our first task was to establish what our vision, mission, and strategy would be in the data warehousing and analytical space,” he said. “We have a goal to make sure that every decision, whether it’s a business decision or a clinical one, is driven by good data.”
“That gives my data warehousing department a mission to create those actionable insights that are required to succeed with future business models, improve outcomes, and ultimately reduce the time-to-value proposition of analytics initiatives.”
Like many organizations, Nebraska Medicine isn’t starting from scratch when it comes to data analytics. The health system has had a number of data-driven applications for some time, but they weren’t functioning as seamlessly as executive leaders could wish.
“Historically, we have been application-centric on our reporting side,” Lancaster explained. “Most of our reports came out from each individual transactional system, because we had a lot of siloes. That created a very individualized perspective from our reports, not a view of the enterprise as a whole.”
Data integrity was further complicating the issue, he continued, because individual departments were each focused on their own applications.
“We really had no overarching data governance plans in place,” Lancaster acknowledged, but the organization quickly realized that it needed to change that.
“The recognition that we needed a method and a process to get to the point of delivering value is really an acknowledgement of the enterprise’s need to evolve,” he said. “If we want to get to self-service data analytics, we need to have data governance rules in place and we need to present the data in a way that users can understand.”
“It’s not just about delivering a report and saying that we sent it out. It’s about helping our business and clinical staff use data to make decisions that drive us forward. That starts with accurate and curated information, which requires a strong governance plan.”
Data governance can be a tricky topic for healthcare organizations, especially for those that are hoping to use the data they already have to make forward-looking decisions. A robust data governance plan is required to cultivate a trusted relationship between users and the information they are using for decision-making – and it is also the first step for integrating big data analytics into the everyday workflow.
“As an organization, we need to build our data competencies and establish best practices about how to visualize data, how to use data to optimize processes, and how to determine the right data solution for the right use case,” Lancaster stated.
Organizations that can’t afford to rip and replace every single one of their applications may wish to follow in Nebraska Medicine’s footsteps by developing a multi-stage analytics process that applies the right metadata and the right logic to untouched data in order to transform it into standardized, useful elements for larger-scale reporting.
“Metadata is always important, and I think a lot of people who are new to the space don’t realize how much logic has to be applied to transactional applications with raw information,” Lancaster said. “So then when they try to perform analytics, they realize that the lowest level of data isn’t useful because they didn’t apply the right rules to it from the beginning.”
The organization uses a staging database to store the original datasets, so that the analytics team can go back to that first copy of raw material without having to extract it from the EHR or other data source a second time. If something goes awry down the line, or the team simply wishes to perform a different analytics task using the same starting point, the raw information is available in the staging area.
“From that staging database – depending on the use case and what we’re trying to accomplish – we’re going to put it into a traditional data mart where it can be transformed and manipulated,” Lancaster continued.
“We will use a runtime routine to apply any business logic or metadata to it, and then we’ll store it in that normalized form. So that is the data that we will use for analytics modeling, and that’s when we’ll load it into fact tables and dimension tables for analytics and eventually expose it to the analytics end-user.”
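The flow Lancaster describes — raw extracts held in a staging area, a runtime routine applying business logic, and the results loaded into fact and dimension tables — can be sketched in a few lines. This is a minimal illustration only: the table names, status codes, and mapping rules below are hypothetical, not Nebraska Medicine’s actual schema.

```python
import sqlite3

# Illustrative only: table names, codes, and rules are invented for this sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Staging: a faithful copy of the raw extract, kept so the team can
#    re-run transformations without pulling from the source system again.
cur.execute("CREATE TABLE stg_encounters (enc_id TEXT, dept_code TEXT, raw_status TEXT)")
cur.executemany(
    "INSERT INTO stg_encounters VALUES (?, ?, ?)",
    [("E1", "CARD", "D/C"), ("E2", "CARD", "ADM"), ("E3", "ONC", "D/C")],
)

# 2. Runtime business logic: normalize source-specific codes into
#    standardized values (a stand-in for the metadata rules described above).
STATUS_MAP = {"D/C": "Discharged", "ADM": "Admitted"}

# 3. Dimensional model: a department dimension plus an encounter fact table.
cur.execute("CREATE TABLE dim_department (dept_key INTEGER PRIMARY KEY, dept_code TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_encounter (enc_id TEXT, dept_key INTEGER, status TEXT)")

for enc_id, dept_code, raw_status in cur.execute(
    "SELECT enc_id, dept_code, raw_status FROM stg_encounters"
).fetchall():
    cur.execute("INSERT OR IGNORE INTO dim_department (dept_code) VALUES (?)", (dept_code,))
    dept_key = cur.execute(
        "SELECT dept_key FROM dim_department WHERE dept_code = ?", (dept_code,)
    ).fetchone()[0]
    cur.execute(
        "INSERT INTO fact_encounter VALUES (?, ?, ?)",
        (enc_id, dept_key, STATUS_MAP.get(raw_status, "Unknown")),
    )

# End-user analytics run against the curated fact/dimension layer,
# never the raw staging copy.
rows = cur.execute(
    """SELECT d.dept_code, f.status, COUNT(*)
       FROM fact_encounter f JOIN dim_department d USING (dept_key)
       GROUP BY d.dept_code, f.status ORDER BY d.dept_code, f.status"""
).fetchall()
print(rows)
```

The key design point is the separation of layers: if a business rule turns out to be wrong, the staging copy is still there to replay, while end users only ever see the curated fact and dimension tables.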
Healthcare providers have several technology choices when setting up this initial database infrastructure. Traditional relational databases offer robust reporting capabilities, but more advanced semantic computing architectures open up the possibility of natural language query capabilities and sophisticated predictive analytics.
Weighing innovation against reliability is not an easy task for healthcare providers looking to prepare themselves for an uncertain future, but Nebraska Medicine looked back to its strategic plan to figure out the best options for its current and future needs.
“If we went with a data lake, which is a fascinating and interesting technology, we might not actually be able to serve some basic needs,” Lancaster pointed out. “So we had to think about our five-year plan. How are we going to provide for the enterprise’s fundamental requirements, but how do we do that in a way that starts to build towards the future?”
The organization settled on building a hybrid environment, which brings the best of both worlds to its big data program.
“We have a formal data mart structure that serves us well from a basic reporting standpoint, but we also thought about serving that up with a parallel warehouse that leverages a data lake approach, which allows us to do real-time analytics and some natural language processing,” said Lancaster.
“The challenge was to plumb it in the right way to allow us to get those key deliverables in weekly iterations but also go from basic reporting and query and analysis to hybrid transactional analytics processing, which is getting a lot of buzz today. It’s a great big data technique, but there are also a lot of shortcomings with putting all your eggs in that basket. So it was best for us to take a dual approach.”
That balanced methodology may be critical for meeting the upcoming challenges of value-based care and the regulatory programs that are likely to push organizations into greater adoption of big data analytics tools.
While most of its big data reports are currently targeted towards the senior leadership team through an executive dashboard, Nebraska Medicine is starting to explore how to best bring quality metric data to clinicians.
“We are enabling several clinical users to access quality measure data for their patients in support of meaningful use and PQRS,” Lancaster said. “That will really help us lay the foundation for MACRA and MIPS and whatever other new programs might come in the future.”
Eventually, Lancaster hopes that the health system will be able to develop quality reports and clinical decision support that can be used across the organization. “We are a teaching hospital, and eventually it would be a natural evolution to develop analytics that could be used directly for that purpose,” he said.
“However, right now we’re in catch-up mode with our analytics plans, because the data warehouse is relatively new to the enterprise. We only went live with it around the beginning of this calendar year. So at the moment, we are applying our efforts to bring insights to administrative and clinical use cases that we need to conquer in order to operate efficiently.”
The organization is still settling into its electronic health record, too. Nebraska Medicine spent two years and $87 million to implement Epic in 2012, and continues to optimize the technology to meet its evolving analytics needs.
“Being able to prove the value of a widely adopted EHR is always challenging,” Lancaster said. “Up until recently, there were concerns that no one ever got anything useful out of Epic in terms of dashboards and reports.”
“That’s one of the reasons why our leadership set a goal of attaining the HIMSS Stage 7 award. The point of EMRAM is to prove that not only are we getting useful information out of the system, but that we can do it in a way that’s actually impacting outcomes.”
The healthcare system achieved Stage 7 recognition in February of 2016, marking a milestone transition from technology adoption to truly impactful health IT use.
“Levels zero through six are about adopting technology, but the highest level of the EMRAM is about really using those tools for the greater good, which is the reason why you bought them in the first place,” said Lancaster. “I think that goal was a huge pivot point for us, and a key part of that was our data warehousing strategy. Epic is the key part of our strategy and has allowed us to move very quickly.”
The next step on the journey is expanding the organization’s familiarity with the big data analytics tools available within Epic, and leveraging those capabilities to drive change across the spectrum.
“Right now, we’re starting to take full advantage of data sets that are external to Epic alongside those Epic clinical data sets,” Lancaster explained. “We’re looking at patient satisfaction metrics and cost information so we can generate a clear view of the data that we didn’t have before, because we were so locked into that siloed approach of having each individual application serving each piece of information.”
The transition from fragmented reporting to enterprise-wide transparency wouldn’t be possible without the careful planning that took place before the organization started to move forward with its big data roadmap, he added.
“Start with a clear vision of what you want to accomplish, but don’t try to boil the ocean,” Lancaster advised. “You need to lay out coordinated, actionable steps that will help you complete tasks to build confidence and earn the trust of your leaders, especially if you want to keep up your funding. So paint the big picture, and give your executives an idea of where you want to end up, but make sure you are delivering results at regular intervals to continue creating excitement and awareness.”
Taking a measured approach to the analytics environment will allow organizations to “fail fast,” he said, which is a much more attractive competency than it sounds.
“Failing fast means you’re not taking on these huge, massive projects that will just slowly crumble without anyone realizing it until it’s too late,” he explained.
“You can pivot quickly if your smaller programs don’t go as planned, and you won’t waste as much time or as many resources. In my experience, a lot of people think they know what they want from an analytical perspective, but until they dive into it and see what’s really possible, their use cases don’t always match reality. So you need to be agile in terms of meeting your goals.”
That agility, paired with a strong team of big data analytics experts with a track record of success, will prepare providers for the growing number of clinical and financial unknowns every organization will have to address, Lancaster stated.
“It’s important to find people who are willing to roll up their sleeves and get into the details,” he said. “A lot of this data is extremely complex, and the healthcare industry needs people who understand it. You have to have the expertise to address data integrity issues and quality issues through technical approaches. Without the data science, it’s very difficult to get results.”