- When it comes to healthcare big data analytics, “more” isn’t necessarily “better.” Extracting actionable insights from data is a struggle for everyone, no matter how complex, seamless, and sophisticated a health IT infrastructure appears on the surface. Big data analytics baffles the majority of providers, in part because it is so difficult to identify, collect, and utilize the right information at the right time for enacting care quality improvements.
Over the past few years, healthcare organizations have been focusing on liberating their data from silos and lockboxes, but interoperability is only useful when the data has a purposeful place to go. For many providers, just getting access to enough data for large-scale clinical analytics or population health management seems like a big enough challenge – and they haven’t necessarily had the time or manpower to figure out where that data is going or how to use it effectively.
Organizations end up frustrated; executives may pull the plug on analytics projects that are not bearing immediate financial fruit. Clinicians complain about convoluted EHR workflows and impractical quality reporting expectations…and big data suddenly looks like nothing more than a big mess.
Why don’t providers who are swimming through petabytes of EHR information, claims data, patient-generated health data, and operational statistics feel any more prepared to tackle the major challenges of patient care than those organizations still using basic Excel spreadsheets for manual reporting on utilization and outcomes? Because quantity does not equal quality, and access does not equal insight.
But healthcare providers don’t need to fall victim to untenable analytics projects, fuzzily defined goals, and systemic stress fractures, says the National Quality Forum (NQF) in a finalized white paper detailing the impact of big data analytics on improvements to patient care. The trick is to transform data into information by enacting cultural, technological, and process-driven changes to the way the industry views its greatest, most challenging asset.
“Raw data alone cannot lead to systematic improvement,” NQF asserts. “It has to be turned into meaningful information. Institutional leadership and culture have to support improvement efforts, and clinicians and healthcare staff need the skills to analyze and apply data.”
Effective systematic improvement will help healthcare organizations achieve a rare and coveted balance between the amount of available data and the actionable insights it can provide, the paper continues.
By convening stakeholders in conjunction with the Peterson Center on Healthcare and the Gordon and Betty Moore Foundation, NQF has formulated a series of recommendations for healthcare big data analytics based on what has worked for innovators and leaders across the industry.
Defining the goals of big data
“The goal of systems improvement is to reduce waste,” the white paper says simply. Poorly designed health IT systems, insufficient health data interoperability, and immature technologies contribute to workarounds that end up squandering more time than they save, especially when new processes are built upon the shaky foundations of old shortcuts.
As rising costs and higher patient volumes stress healthcare organizations to the breaking point, providers must ensure that their health IT processes are meeting rigorous standards to protect data integrity, safeguard patients, allow for proactive, preventative care, and position organizations for success with population health management, value-based reimbursements, and other emerging care strategies.
Providers can clearly outline their big data analytics goals by asking themselves the following questions:
• What specific use cases does the organization wish to address? Preventable readmissions? Access to care? Chronic disease management? Supply chain management?
• What clinical and financial data is easily accessible within the organization to help measure and improve these issues?
• What data must be gathered from outside the organization to help meet these goals? How can the organization strengthen partnerships within the community to improve health information exchange?
• Are there any health IT tools currently in place that can help to collect or analyze data? Are new tools needed to address data collection and interoperability concerns?
• How can current staff members contribute to solving these problems? What new team members are needed to round out the big data analytics skill set?
Identifying the systematic challenges
The answers to many of those questions may lead healthcare organizations into some common roadblocks that are symptoms of major systemic deficiencies. The healthcare industry is still at the beginning of its transformation into a data-driven ecosystem, and “current improvement capabilities are less than current data capabilities,” NQF says.
“The availability of data was less of a constraint than the ability to use and apply the data toward improvement,” the paper states, recapping some of the experiences of stakeholders participating in discussions about systematic improvement. “In many cases, the technological issues were relatively straightforward, albeit still difficult, while an organization’s capability to use data depended on complex factors, ranging from workforce skills to organizational culture.”
Many organizations still operating primarily under fee-for-service reimbursement models have little incentive to streamline service delivery, divert patients away from expensive emergency departments or inpatient care, and implement analytics tools that help move away from “firehose” approaches to providing care.
Providers used to billing for services wholesale are also extremely wary of new clinical quality measurement systems that gauge performance by tracking outcomes and patient satisfaction, yet collecting this data is critical for pinpointing areas of concern, standardizing evidence-based approaches to care, portioning out incentives, and developing a holistic approach to turning big data into actionable insight.
“We always have to ask ourselves how we can turn data into meaningful information, particularly for consumers and providers,” said National Quality Forum President and CEO Dr. Christine Cassel to HealthITAnalytics.com in a recent interview. “You do that by creating measures that are accurate and that are consistent so that you can compare apples to apples across a community, a group of providers, or across the nation as we try to achieve the Triple Aim of better population health, better healthcare, and more affordable healthcare.”
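As a toy illustration of the kind of consistent, apples-to-apples measurement Cassel describes, the sketch below computes a simple 30-day readmission rate for each provider from a shared set of admission records. The provider names, field layout, and data are hypothetical, and real NQF-endorsed measures add risk adjustment and exclusion criteria that this deliberately omits; the point is only that applying one definition uniformly is what makes results comparable across organizations.

```python
from collections import defaultdict

# Hypothetical admission records: (provider, patient_id, readmitted_within_30_days)
admissions = [
    ("Hospital A", "p1", False),
    ("Hospital A", "p2", True),
    ("Hospital A", "p3", False),
    ("Hospital A", "p4", True),
    ("Hospital B", "p5", False),
    ("Hospital B", "p6", False),
    ("Hospital B", "p7", True),
    ("Hospital B", "p8", False),
]

def readmission_rates(records):
    """Return each provider's 30-day readmission rate, computed the same
    way for every provider so the results are directly comparable."""
    totals = defaultdict(int)
    readmits = defaultdict(int)
    for provider, _patient, readmitted in records:
        totals[provider] += 1
        if readmitted:
            readmits[provider] += 1
    return {provider: readmits[provider] / totals[provider] for provider in totals}

rates = readmission_rates(admissions)
print(rates)  # {'Hospital A': 0.5, 'Hospital B': 0.25}
```

Because both hospitals are measured by the identical rule, the gap between 0.5 and 0.25 is a meaningful signal rather than an artifact of differing definitions.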
Making big data useful to the end users
Organizational buy-in may be one of the foundational keys to a successful big data analytics project, but it’s also one of the most difficult things to secure. Executives generally want quick, significant financial returns on a project that is purported to bring improvement to an organization, while clinicians want to keep their focus on providing patient care, and often don’t want health IT to complicate or interfere with their central mission.
“The true meaning of ‘big data’ is data that comes as a byproduct of other kinds of interactions,” Cassel said. “I think from the provider perspective, it’s the goal to say that they don’t have to hire staff to collect and report data to all these different entities. They just want to push some buttons on the computer, and produce reports that would deliver measures in this format or that format. Most importantly, it would tell them how they’re doing, where their gaps are, and what they need to do to improve.”
Big data analytics tools must keep it simple for clinicians if they are to deliver any benefit, and they must leverage industry-wide data standards that encourage interoperability and a plug-and-play mentality. As technology developers embrace the idea of data integration and information exchange, they are opening up new opportunities for extending analytics past the walls of the clinic and into the homes of patients.
Patients are a “largely untapped data resource,” NQF says, and the meaningful integration of patient-generated health data with EHR data, claims, and socioeconomic information is the holy grail of healthcare analytics. Most organizations are not at that level just yet, but emerging consumer interest in the connected devices, wearables, monitoring tools, and patient engagement technologies that make up the Internet of Things is beginning to make a splash in population health management circles.
“Beyond providing data, people can encourage improvement by using data to select high-quality providers, collaborating with providers on improvement initiatives (such as through patient and family advisory councils), and participating in community and regional collaboratives,” the white paper adds. “Consumers can also continue to build the demand for accessible and complete data, such as open medical records (‘open notes’) or the inclusion of information about care from all providers and relevant sources in their records.”
“To help people take on a greater role, the huge volumes of ‘big data’ must be turned into meaningful information that is available at the fingertips, or at least arm’s reach, of consumers and purchasers. This is an agenda for progress that will have to involve multiple stakeholders, bringing data scientists together with data users,” which include clinicians and patients alike.
Outlining strategies for ongoing improvement
When it comes to big data, striking the balance between volume and value is an elusive and complex goal. The industry may be making progress towards the high level of health data interoperability that will enable more sophisticated analytics and tailored insights, but there remains a great deal of work to do.
The National Quality Forum and its partners are deeply involved in helping providers achieve the benefits of big data, and have developed a high-level roadmap for the systematic improvement necessary to deliver better patient care. In addition to bolstering the nation’s quality measurement framework, NQF makes the following suggestions for ongoing progress towards interoperability and big data success:
• Encourage both public and private payers to make data more widely available in a timely manner, especially data sources that enable population health management and large-scale measurement. A Medicaid analytics platform that integrates patient data across state systems would contribute to this goal.
• Continue to make true health data interoperability a priority for EHR vendors by eliminating data access fees, developing widely available APIs, and adhering to industry-accepted data standards for EHRs and other health IT tools.
• Develop a data-driven culture within healthcare organizations by training staff to effectively use clinical analytics tools for patient care improvements. Providers should invest in education, internal data governance strategies, and organizational change management that place an emphasis on using big data assets strategically.
• Promote patient engagement by encouraging patients to provide patient-generated health data and leave feedback on their experiences with the healthcare system. Providers and patients should collaborate on improvement initiatives by creating local or regional collaboratives to collect data on the quality of care delivered by healthcare organizations.
• Retool policies and governance frameworks to focus on the deployment of common quality metrics. Providers, developers, and rule makers should work together to improve the efficiency and efficacy of measurement structures, establish standards for common data elements, and ensure that healthcare workers receive the training and education required to engage in big data analytics and data-driven healthcare.
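To make the interoperability recommendation above concrete, here is a minimal Python sketch that parses a FHIR R4 Observation resource, the kind of standardized record an EHR adhering to industry-accepted data standards would return through its API. The patient reference and measurement values are fabricated for illustration; the structure (a LOINC-coded `code` element plus a `valueQuantity`) follows the published FHIR specification.

```python
import json

# A FHIR R4 Observation resource, as a standards-compliant EHR API
# might return it. Values here are made up for illustration.
fhir_observation = json.loads("""
{
  "resourceType": "Observation",
  "status": "final",
  "code": {
    "coding": [{
      "system": "http://loinc.org",
      "code": "29463-7",
      "display": "Body weight"
    }]
  },
  "subject": {"reference": "Patient/example"},
  "valueQuantity": {"value": 72.5, "unit": "kg"}
}
""")

def summarize_observation(obs):
    """Extract the coded display name and measured value from a FHIR
    Observation. Because every standards-compliant system emits the same
    structure, this works regardless of which vendor produced the record."""
    coding = obs["code"]["coding"][0]
    qty = obs["valueQuantity"]
    return f'{coding["display"]}: {qty["value"]} {qty["unit"]}'

print(summarize_observation(fhir_observation))  # Body weight: 72.5 kg
```

This is the plug-and-play mentality in miniature: an analytics tool written once against the standard can consume observations from any conformant source without vendor-specific parsing code.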
As more healthcare organizations develop the technologies and competencies to extract actionable meaning from their raw data stores, they will learn to balance quantity and quality while driving waste and workarounds out of the system. The industry’s early forays into healthcare big data analytics have identified myriad challenges, but have also highlighted the enormous potential for using data to reduce costs, raise quality, and tighten operational efficiencies.
By combining technology, governance, and patient-centered enhancements to the care process, healthcare organizations can be sure that they are getting the most out of their investments without drowning in the quicksand of collecting data for data’s sake.