Big data is everywhere. Whether you’re a fan of the term or not, the impact of these messy, massive data sets proliferating at an unprecedented pace is readily apparent in the healthcare industry, where access to timely, accurate information truly is a matter of life and death.
Healthcare organizations have always had an interest in harnessing the potential of big data analytics in pursuit of new treatment strategies and better patient outcomes, and that curiosity has only intensified as EHRs, health information exchanges, and data repositories bring staggering new capabilities to informaticists and data scientists.
But with big data comes big problems. Chief amongst them is the challenge of ensuring that providers and their patients are benefiting from this deluge of information instead of being overwhelmed by it. The healthcare industry, as a whole, has not been able to overcome this barrier as quickly or effectively as it might wish. Frustration, confusion, and anger have followed federal attempts to goad providers into advanced EHR use, and true system-wide interoperability is still just a gleam in the eye of a few pioneering organizations.
Taking a fragmented, reticent, cash-strapped healthcare system by the scruff of the neck and dragging it into the modern electronic era has never been an enviable task, as CMS and the ONC know all too well. But healthcare providers don’t necessarily have to be bullied into using big data. The system just has to prove to itself that big data is an asset to be embraced, not a burden to be feared.
At the National Quality Forum (NQF) and the Peterson Center on Healthcare, making the case for big data hasn’t been the easiest proposition. While the development of standardized quality measures may seem like a step in the right direction for healthcare organizations that don’t know where to start with their data assets, those same providers seeking guidance have also complained bitterly about the administrative burdens and complexity of quality reporting programs and performance measurement schemes.
The pushback has hardly deterred NQF President and CEO Christine Cassel, MD, from continuing to advocate for more accurate, more meaningful quality reporting frameworks that can weave performance improvement and better outcomes deeply into the fabric of the healthcare system. If you can’t measure it, then you can’t manage it, Cassel told HealthITAnalytics.com, and big data in healthcare is absolutely no exception.
“The core work of the National Quality Forum is to be the gold standard for measures that are used throughout the healthcare system. Everybody needs and deserves to have consistent and accurate information about healthcare,” Cassel stated. “That’s easy to say, but in a world where data sources are multiplying by leaps and bounds and where there’s a growing emphasis on tying payment to performance, there’s a lot of complexity when it comes to actually implementing systematic improvements.”
“Some of those challenges have to do with understanding big data in every way it manifests itself,” she continued. “We always have to ask ourselves how we can turn data into meaningful information, particularly for consumers and providers. You do that by creating measures that are accurate and that are consistent so that you can compare apples to apples across a community, a group of providers, or across the nation as we try to achieve the Triple Aim of better population health, better healthcare, and more affordable healthcare.”
In conjunction with the Peterson Center on Healthcare and the Gordon & Betty Moore Foundation, the NQF recently released a white paper outlining the feedback it has received from stakeholders deeply entrenched in the thorny problems of marrying big data with measurable improvement. Optimism and persistence do figure largely into the narrative, but there is a fundamental problem: not all healthcare providers perceive themselves – or even want to perceive themselves – as a piece of a larger whole.
“One of the big questions we’re facing is if we even see our healthcare system as a system,” pointed out Dr. Prabhjot Singh, Special Advisor for Strategy and Design at the Peterson Center on Healthcare. “If we do, then there might actually be ways to think about how we systematically engineer it to be higher quality. If we are trying to do that, then there ought to be whole-system measures to help us track that entire process of change, and help to illustrate how you can enact system-level improvement that is useful at the practice level.”
“We need very targeted and customized data for delivery improvement that's practical at the physician level,” he added. “So when we’re designing a practice or re-engineering a practice, what type of information will actually help change its decision-making? What information assets does it need to actually connect with other organizations and make that possible?”
Access to information assets remains a basic obstacle for many healthcare organizations, even those with certified, brand-name EHRs. To get around the problems involved with extracting, normalizing, and analyzing big data, much of which may be locked up in narrative free-text, providers have turned to a ready source of clinical and financial information: insurance claims.
“Many of the current accreditation programs and quality measures still rely on claims data, because you can use it across the board,” Cassel explained. “In order to get paid, everybody goes through the claims process, and it produces a lot of relatively standardized data.”
“However, as organizations invest more and more in electronic health records, those are becoming better sources of information for measures,” she continued. “First of all, they are real-time, so you don’t have to wait a year to get your report back from this payer or that payer. Secondly, they give you a picture of all of your patients, not just the Medicare patients or the Medicaid patients or the Aetna patients or the UnitedHealth patients, which is how many organizations are currently looking at their data, unfortunately.”
“EHR data can create a deep and rich platform for big data analytics, and we are increasingly pushing for the creation of e-measures that can be derived from electronic medical records. There are also other kinds of data systems out there, including registries that hospitals or medical specialty groups have pulled together. Those are wonderful tools for creating national benchmarks.”
But before the healthcare system can create national benchmarks, it needs national data standards and a consensus on how to deploy and use them. The industry, as a whole, may be many years away from creating seamless lakes of big data which can be dipped into, moved, shared, and extracted at a moment’s notice.
“We are very aware of all of the interoperability problems that are currently making it very difficult to get comprehensive information about a single patient, if that patient is part of more than one health system, which really includes everybody,” Cassel acknowledged. “People travel, and people have multiple providers, so if you really want to have metrics that truly measure quality or promote care coordination, you need to be working with a higher level of interoperability.”
A number of public, private, and hybrid organizations, including NQF and the Peterson Center, are doing their best to shepherd this process along. With initiatives like the CommonWell Alliance, the Argonaut Project, and Carequality garnering attention, accolades, and new members at a respectable clip, the industry is gathering momentum to make some big breakthroughs in interoperability. As it does so, the consolidation of disparate data sources will follow.
“We see a huge potential for assembling a lot of the scattered big data assets that are out there,” Singh said. “Some of those are big data; some of them are small but important. As we look at bringing this data together, there is the challenge of using it to measure improvement across a field.”
“Are we giving practitioners the right information that they need? Is it available on demand in a timely way? Are we using existing data assets that are out there to pull out peak performance, especially when it comes to high quality? What does high quality and high performance really mean? For the Peterson Center, the role of data is really central, but it's always really connected to the work of the practitioner, and improving quality for patients and consumers.”
Shifting the role of big data from an academic, intangible concern into a valuable tool for systematic improvement has not been easy. Providers, consumed with managing the swift pace of daily life in the clinic and wary of anything that will take time away from their ability to stay with patients in the consult room, have not been eager to embrace developers’ initial attempts at leveraging big data for actionable insights.
EHR alerts stemming from analytics algorithms have been cumbersome and distracting; prompts and check-boxes intended to ensure best practices have instead forced users to develop creative but detrimental workarounds. Technology developers have not quite figured out how to squeeze additional functionalities into already-cluttered EHR interfaces without producing exasperation and disaffection – and care quality may be suffering as a result.
Cassel suggests that big data developers take a cue from the consumer side of things. Passive data gathering and reporting is the key to the early success of the Internet of Things, which enables consumers to do everything from logging their sleep and exercise to paying their bills to receiving notices from their healthcare providers without requiring much in the way of active time or effort.
“The true meaning of ‘big data’ is data that comes as a byproduct of other kinds of interactions,” said Cassel. “I think from the provider perspective, it’s the goal to say that they don’t have to hire staff to collect and report data to all these different entities. They just want to push some buttons on the computer, and produce reports that would deliver measures in this format or that format. Most importantly, it would tell them how they’re doing, where their gaps are, and what they need to do to improve.”
“In that world, where big data is really part of the workflow, you would also have a more seamless way of getting patient reported outcomes and patient-generated health data,” she continued. “We tend to think of healthcare quality data as coming from the delivery system. There’s a good reason for that: it’s where we learn how long people are in the hospital, what medications they get, what surgeries they have, what complications they have. All of those kinds of things.”
“What we’re really missing is the patient voice. How are they feeling a month after their hip replacement? Are they recovering from their pneumonia once they leave the hospital? We don’t have as much information about that as we could,” she lamented. “In the world of apps on your phone and social media and ways to get people to interact and respond, I think I would like to see much richer availability of patient reported data over the next three to five years that could be used in a lot of these performance measures.”
Providers have expressed mixed opinions on the value of patient-generated health data to their decision-making processes, but few would argue with the fact that patient engagement and the patient perspective are quickly becoming a major part of an organization’s financial well-being, especially as value-based reimbursement programs stress the importance of consumer satisfaction and patient outcomes.
While many healthcare organizations have embraced this trend, albeit with varying degrees of enthusiasm, others are experiencing anxieties about adding patient-centered data sources to a system that is already convoluted and hard to manage. Patient-generated health data is unwieldy, noisy, and difficult to digest in the ten or fifteen minutes that most providers are lucky to have with their patients. Data for the sake of data is not always the solution, and finding a balance between analytics and experience is not an easy task.
“That is the fundamental question right now,” Singh agreed. “Do we want an impersonal, data-oriented system, or do we want a customized, person-oriented experience? And I think that that really probably speaks to the state of the field at the moment, where we end up using big data to paint the industry in broad brushstrokes of quality, whether the performance is high or low.”
“But there is a disconnection between the experience of practitioners and the experience of patients and consumers, who are not able to mobilize that information for their own feedback, for their own ability to see how any of that kind of big picture stuff impacts their lives. At the end of the day, we have to put that systemic information into the hands of providers, and then into the hands of patients and consumers so we can understand how they are using data to improve the quality of their work and the quality of the outcomes.”
The challenge of bringing that “big picture” home to providers ties in, once again, to clinical quality measures. Sending reports to centralized locations is the only way that national-level overseers like CMS can truly understand variations in care, utilization, and outcomes at the practice level, yet providers continually balk at the time-consuming nature of the task.
“Providers are struggling with the scope of reporting that they have to do, particularly those who are not part of these big systems where somebody else takes care of all that for you. I’m very sympathetic with this,” Cassel said. “I think that there is a lot of work that goes into performance reporting. We all know how important it is, and we all know the tremendous gains that have happened because we’ve been able to look at performance measures and reduce hospital-acquired conditions, for example, and improve quality and safety in so many areas.”
“We are spending a lot of time this year looking at how we can get to a place where the measures are comprehensive, meaningful, and useful to providers according to their areas of specialty, which we haven’t always done to the highest level.”
“If you’re a neurosurgeon or a neurologist who specializes in multiple sclerosis, and you’re going into a value-based reimbursement program, there really aren’t any measures that accurately describe what you do,” she said. “It’s a Goldilocks problem. We have too many measures, and at the same time we don’t have enough. What we need is to get them just right, and that’s a big part of how NQF can be valuable to the physician community.”
Quality reporting doesn’t have to be a one-way street, said Singh. Providing feedback to healthcare organizations can make a major impact on the way they conduct themselves, he said, citing an example of a two-physician practice in Tennessee that had no idea it was one of the nation’s top performers before quality reporting data showed the organization just how well it was taking care of its community.
“If you let providers know that they are doing a good job, and you have the data to back that up, it changes their mindset about what they have to offer other people,” he asserted. “It makes them more introspective about what makes them a bit different. And at the same time, it allows them to feel like they’re part of a much larger group of people in the country who are trying to get better, and that they have something to contribute.”
“At the Peterson Center, we’re going into these practice settings and understanding how they work. What's their culture like? Who are their leaders? How can they help us scale those features across the country? The challenge is doing that in a way that is in sync with the practitioner's workflows. And for us, we really see that as being the opportunity and challenge to help figure that out alongside the provider community,” he added. “As providers get more practice with big data analytics and managing electronic information, I think that we will find some of the sweet spots where practitioners can engage with the data, see meaningful results, and improve their actual job satisfaction.”
The Peterson Center on Healthcare and the National Quality Forum are working together to turn the theory of big data analytics into something the healthcare system can truly embrace with open arms. While the process is fraught with obstacles and no small measure of opposition, improving the way that big data is used for quality reporting and benchmarking is the first major step towards gauging and guiding large-scale systematic improvement.
“Sometimes you end up putting a lot of effort into something which isn't acknowledged as being useful,” Singh said. “And there can be a lot of frustration if there isn't that type of connectivity and resonance that goes up and down these increasingly complex systems. Part of what we're hoping to do with the NQF is to provide practical guidance to the field, and get a lot of feedback on how we can support the leaders, investigators, and entrepreneurs who are creating the linkages that allow data analytics to thrive at the practice level.”
“Getting from a point where things are a little bit scattered and disconnected to those more seamless, closed-loop systems of care coordination and health information exchange…it takes a lot of effort, and we’re very mindful of that,” Singh acknowledged. “It’s going to require support, investment, and encouragement to get there. But we do know that if you are able to do those things, and you're able to use data to assess whether you're doing a good job, we know that consumers and providers both really appreciate that experience.”
“It can be very data-intensive, and we're looking for innovators and thinkers that can bring big data and systematic improvements together. That’s something that we're keen to learn more about, especially from people that are doing it.”