- Texas may be famous for its unquenchably independent spirit, but thousands of healthcare providers in the northern part of the state are starting to recognize that when it comes to population health management and big data analytics, the path towards improving care quality and reducing costs requires a collaborative approach.
At the North Texas Regional Extension Center (NTREC), which covers some of the region’s most populous communities, Executive Director Richard C. Howe, PhD, FHIMSS, and Theresa Mendoza, Director of Quality, BI, and Data Services at the Dallas Fort Worth Hospital Council Education and Research Foundation, are leading the vast majority of Dallas-Fort Worth providers into an era of detailed, robust population health management. They are doing so by leveraging sophisticated big data analytics tools that can only function with widespread buy-in and commitment from the healthcare community.
“We have approximately 95 percent of the healthcare marketplace here in the north Texas region participating in our data collaborative,” said Mendoza in an interview with HealthITAnalytics.com. “They agree to share patient data at a certain level. Because we are the ones who de-identify it, we have been able to implement a regional enterprise master patient index (REMPI) over the whole Dallas-Fort Worth Metroplex.”
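Because NTREC de-identifies the submitted records itself before linking them in the regional enterprise master patient index, the core idea can be pictured as tokenization: direct identifiers are replaced with a consistent salted-hash token so the same patient links across hospitals without exposing who they are. The sketch below is purely illustrative — the field names, the salt handling, and the truncation are assumptions, not NTREC’s actual method:

```python
import hashlib

def deidentify(record, salt):
    """Replace direct identifiers with a salted-hash token so the same
    patient produces the same token regardless of submitting hospital.
    (Illustrative only; field names and hashing scheme are assumed.)"""
    key = f"{record['first']}|{record['last']}|{record['dob']}|{salt}"
    token = hashlib.sha256(key.encode()).hexdigest()[:16]
    return {
        "patient_token": token,
        "zip3": record["zip"][:3],        # keep only coarse geography
        "encounter": record["encounter"],
    }

SALT = "collaborative-secret"  # hypothetical shared secret held by the REC
a = deidentify({"first": "Ana", "last": "Ruiz", "dob": "1980-01-02",
                "zip": "75201", "encounter": "E1"}, SALT)
b = deidentify({"first": "Ana", "last": "Ruiz", "dob": "1980-01-02",
                "zip": "75201", "encounter": "E2"}, SALT)
```

Because hashing is deterministic, two encounters for the same patient carry the same token, which is what lets a master patient index stitch records together after the identifiers are gone.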
“That has allowed us to do some unique analysis around ED utilization and some community health issues around diabetes. Now that we have post-acute care skilled nursing facilities and possibly some long term care facilities coming on, too, we have the opportunity to work on improving transitions of care for patients, especially with us also working on analytics from the physician side.”
Founded in 2010 as part of the nationwide Regional Extension Center network, NTREC grew out of an existing collaboration called the Dallas Fort Worth Hospital Council Education and Research Foundation (DFWHCF), which has been operating in the region for nearly half a century with close to ninety hospitals on its membership rolls.
These deep community roots have enabled NTREC to engage in inpatient claims analytics for more than a decade, and now that the REC has successfully shepherded a large number of physicians towards EHR adoption, Howe and Mendoza have been able to start some important work on the outpatient side, too.
“On the physician side, we are going to start with claims data first, like we did on the inpatient side, because it’s quite uniform,” Howe explained. “We intend to add more of the meaningful use data and EHR data to that, which is a lot more difficult to do. Actually, come October 1st, assuming ICD-10 stays on track this year, there will be a lot of information we can get from the claims data, so it will really be of high value.”
The value will be immediately evident to physicians, many of whom are struggling to cope with surging numbers of high-cost, high-complexity patients in need of comprehensive chronic disease management. Healthcare organizations are in desperate need of actionable insights into their patient populations, and only a small number of physicians are currently able to generate that data on their own. Regional Extension Centers and statewide health information exchanges are filling that void with big data analytics capabilities that remain elusive to smaller, individual providers, Howe says.
“We can help with population management by doing things like looking at a physician practice group and seeing how many of their patients have a chronic disease,” he said. “Then we can drill down and analyze it by physician. So each physician can look at his or her own patients and determine what services they need to be providing. We can examine co-morbidities as well and look at how many have diabetes and hypertension and lipid disease, which can really help to understand the patient population’s needs.”
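The kind of drill-down Howe describes — counting chronic conditions and co-morbidities within a single physician’s panel — amounts to a simple aggregation over claims rows. A minimal sketch, using invented physician, patient, and diagnosis values rather than any real NTREC data:

```python
from collections import Counter

# Hypothetical de-identified claims rows: (physician, patient, diagnosis)
claims = [
    ("dr_a", "p1", "diabetes"), ("dr_a", "p1", "hypertension"),
    ("dr_a", "p2", "diabetes"), ("dr_b", "p3", "lipid_disease"),
]

def chronic_profile(rows, physician):
    """Count chronic conditions in one physician's panel and how many
    of that physician's patients carry more than one condition."""
    panel = {}
    for doc, patient, dx in rows:
        if doc == physician:
            panel.setdefault(patient, set()).add(dx)
    condition_counts = Counter(dx for dxs in panel.values() for dx in dxs)
    comorbid_patients = sum(1 for dxs in panel.values() if len(dxs) > 1)
    return condition_counts, comorbid_patients

counts, comorbid = chronic_profile(claims, "dr_a")
```

Here dr_a’s panel shows two diabetic patients, one hypertensive patient, and one patient with both conditions — the same per-physician view, just on toy data.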
“Our goal is to use the physician analytics to improve patient care long-term by having physicians log on to the system and look at data specific to their practice,” Howe added. “Ideally, they’ll be able to dial in securely and use the analytics tools themselves. They can access the information through a web-based portal that uses cloud storage to make it easy for them.”
NTREC’s job is made easier by the relatively high level of consolidation across the area’s providers. A limited number of large health systems own many of the smaller physician groups, which keeps technologies aligned, and just a few major EHR vendors have scooped up most of the customer base, which means fewer interoperability roadblocks and smoother health information exchange.
“Epic and Allscripts really dominate both the inpatient and ambulatory settings in the region,” Howe said. “We do have a few other vendors on the ambulatory side, but they are still all the mainstream vendors. By the time you make a list of the top ten ambulatory vendors, you have covered 90 percent of our market, so the number of vendors we have to deal with is pretty small.”
The streamlined landscape revolves around NTREC’s core REMPI technology, which helps to make sure that critical patient information follows consumers as they move between health systems or physician providers. This key care coordination technology makes patient matching across the community a manageable feat – thanks to a lot of back-end work from Mendoza, her team, and the individual hospitals who benefit from the system.
“Right now, we have built everything on an SQL platform because it’s easier to maintain than some of the other options,” Mendoza said. “It’s easier to find the talent needed to support SQL, which is one of the things we had to consider when we were implementing the new warehouse.”
“It’s very flexible. We spent about two years testing the data, loading the data, and normalizing data, so we felt like we had good matching algorithms in our EMPI,” she continued. “We have been so successful with our patient matching because we have a data cleansing tool that all the hospitals are required to use before they submit their data to us. We make sure that required data elements are populated, that they are in the right format, and that we are not missing critical data elements that might affect that matching algorithm. It’s key for them to maintain a certain data quality, and that’s why our match rate is so high.”
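The pre-submission checks Mendoza describes — required elements populated, values in the right format — are the kind of rule-based validation a data cleansing tool applies before records ever reach the matching algorithm. A minimal sketch, with the required fields and date format chosen as assumptions for illustration:

```python
import re

# Assumed required elements; NTREC's actual list is not public in this article.
REQUIRED = ("mrn", "last_name", "dob", "sex")
DOB_PATTERN = re.compile(r"\d{4}-\d{2}-\d{2}")  # assumed YYYY-MM-DD format

def validate(record):
    """Return a list of data-quality errors; an empty list means the
    record is ready for submission to the collaborative."""
    errors = []
    for field in REQUIRED:
        if not record.get(field):
            errors.append(f"missing {field}")
    dob = record.get("dob")
    if dob and not DOB_PATTERN.fullmatch(dob):
        errors.append("dob not in YYYY-MM-DD format")
    return errors

ok = validate({"mrn": "123", "last_name": "Lee", "dob": "1975-03-04", "sex": "F"})
bad = validate({"mrn": "", "last_name": "Lee", "dob": "03/04/1975", "sex": "F"})
```

Rejecting or flagging records like `bad` before submission is exactly why the collaborative’s match rate stays high: the matching algorithm only ever sees records that have already passed these gates.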
Poor data integrity is the Achilles’ heel of all big data analytics efforts, and despite the laudable commitment from the region’s hospitals, gaps and errors can still cause headaches for the NTREC REMPI.
“I would say that hospitals are still struggling to get their clinicians to document the way they need to,” Mendoza acknowledged. “You can do a lot of back-end work with the data cleansing tool – it can tell them where there are issues or errors with the data – but those fixes aren’t ideal from a data governance perspective. Some organizations are able to refine their processes so they don’t have to deal with those types of errors anymore. But some are not able to do that, so they just have to keep going back to the well to get the answers they need to make sure the data is clean before they get it to us.”
A strong emphasis on data governance is required to make population health management work, Howe agreed. Each hospital is responsible for ensuring its own data is up to snuff, which means that individual organizations must be certain they are producing quality information before it ever enters NTREC’s data warehouses for further processing. NTREC has access to data on more than 45 million encounters and 10 million patients, with records going back to 1999, which presents an enormous challenge for normalization and data extraction.
“When you have 80-plus hospitals and their clinics and physicians all submitting data, it’s going to come in many, many different formats,” Howe said. “And that’s why if you really want to do analytics across a region, you really have to have tools to help standardize that quality, or you would be just collecting data that you can’t do analytics on.”
“It’s not just a technical challenge,” he stated. “The quality of the data coming in is the biggest factor for patient matching. And, of course, if you are going to do health information exchange across organizations, you have to match a patient very accurately. If we have good quality data coming in, we get good match rates. If we get poor quality of data coming in, we get poor match rates. It doesn’t matter what your matching algorithm is.”
Establishing a data quality framework that adheres to agreed-upon standards is no easy feat for health systems that are far more used to competing with each other for market share than cooperating on policies and procedures. That’s exactly where the Dallas Fort Worth Hospital Council Education and Research Foundation’s steady, familiar presence in the region’s healthcare ecosystem becomes a major advantage.
“We call ourselves Switzerland,” Mendoza joked. “When our participants come into the building and meet to talk about projects and best practices, they put down the boxing gloves and they work on things that they all have in common. When they leave, of course, sometimes the gloves go back on. But they’re able to work together when it counts.”
“When a hospital signs an agreement to participate in the collaborative, they are agreeing not to use the data against their competitors,” she said. “The data is being used to improve quality, patient safety, and community health. It’s not being used to put on a billboard somewhere that says one hospital has better outcomes than all the rest. They agree to use it for internal purposes and for the collaborative only.”
That level of consensus is not easy to architect. Other regions have faced stinging failures, combative negotiations, and deeply disappointing setbacks when trying to bring healthcare organizations together for meaningful population health management, but artful negotiation coupled with sustained effort has helped Dallas-Fort Worth become a leader in regional health information exchange and big data analytics.
“Because of the trust involved here, and because of our governance structure, we have been able to develop a level of sharing of data across competing organizations within the metropolitan area that I think is unique in the country,” said Howe. “We have seen other areas try to do it, especially through regulation about data reporting. But without trust, it won’t happen. It’s not a technical issue. It’s a political issue. We’ve taken a very neutral approach when it comes to our provider organizations, which has really helped make sure they’re comfortable with what we’re doing.”
While the collaboration’s hospitals have been reluctant to share the majority of their financial data, even with NTREC and the Hospital Council, the collaborative attitude fostered by NTREC has produced some valuable insights. Those insights carry significant implications for providers hoping to slash costs by reducing unnecessary utilization and helping patients get their health needs under control. Using claims data and mapping software, the REC has been able to identify hot spots of high emergency department usage, sometimes down to the neighborhood block.
“You have pockets of patients that are doing a majority of the utilization,” explained Mendoza. “We have even seen some patients who have more than 200 ED visits in a year. The high utilization could be due to lack of resources, drug-seeking patients, or other unknown reasons. Of course some of the privacy standards tie our hands in relation to identifying those patients from our end, so we are looking forward to a time when we have the right language in place that allows us to free the data in a way that helps providers really address their frequent fliers.”
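Finding those pockets of utilization is, at its core, a pair of group-by counts over de-identified ED visit records: one by geography to locate hot spots, one by patient token to surface frequent utilizers. A toy sketch with invented ZIP codes and patient tokens (the real analysis uses claims data and mapping software at much finer granularity):

```python
from collections import Counter

# Hypothetical de-identified ED visits: (zip_code, patient_token)
ed_visits = [
    ("75201", "p1"), ("75201", "p2"), ("75201", "p1"),
    ("75210", "p3"),
]

# Hot spots: which areas generate the most ED visits?
visits_by_zip = Counter(zip_code for zip_code, _ in ed_visits)
hot_spot = visits_by_zip.most_common(1)[0]

# Frequent utilizers: which (still de-identified) patients return most often?
visits_by_patient = Counter(token for _, token in ed_visits)
```

In this toy data, ZIP 75201 is the hot spot with three visits, and token p1 accounts for two of them — the aggregate pattern is visible even though, as Mendoza notes, privacy rules keep the REC from re-identifying who p1 actually is.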
In the meantime, hospitals can take measures into their own hands by integrating private financial metrics with the datasets developed by NTREC. “What we do have is a pretty robust data analytics tool that is like a self-service buffet,” Mendoza said. “A hospital can export a specific data query from us, and then they’ll go and add in their reimbursement rates for certain payers to create their own analysis. That’s internal to them, and we’re not privy to it.”
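That “self-service buffet” pattern — export a de-identified utilization query from NTREC, then join in private reimbursement rates locally — is a straightforward merge. The sketch below uses invented DRG codes, payers, and rates; the actual export fields are assumptions:

```python
# Hypothetical rows exported from the collaborative's analytics tool
export = [
    {"drg": "470", "payer": "A", "cases": 10},
    {"drg": "470", "payer": "B", "cases": 4},
]

# The hospital's private reimbursement rates, keyed by (drg, payer).
# These stay internal and are never shared back with the collaborative.
rates = {("470", "A"): 12000.0, ("470", "B"): 9500.0}

enriched = [
    dict(row, est_revenue=row["cases"] * rates[(row["drg"], row["payer"])])
    for row in export
]
total_est_revenue = sum(row["est_revenue"] for row in enriched)
```

The join happens entirely on the hospital’s side, which is the point Mendoza makes: NTREC supplies the utilization counts, and the financial overlay remains private to each member.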
While Howe and Mendoza hope to increase their access to financial information as they continue to compile additional big data from EHRs and other sources, data-driven population health management is off to a very strong start in North Texas. Trust, cooperation, planning, and a slow-but-steady approach to building infrastructure and analytics capabilities are at the core of the community’s success, Howe reiterated, and they are prerequisites for any healthcare organization that wants to build momentum towards meaningful results.
“When it comes to doing big data analytics, you have to have two things right off the bat,” he said. “The first is strong governance. You have to get all the participants in the same room so they can determine what data they are going to contribute. Get the absolute governance structure outlined at the beginning so you know what direction you’re trying to go.”
“The second thing is to start simple. If you start off with thousands of different data elements, you’re just going to drown in the data before you see any results. We started with claims data, and we found that there is a lot of really good information there that has been valuable to our hospital members even before we started adding more clinical information. I would say good governance has to start simple.”
The CHIME CIO Features is a collaboration between Xtelligent Media, LLC, and the College of Healthcare Information Management Executives (CHIME), featuring leading hospital and healthcare system CIOs and their experiences in health IT implementation and innovation. For more information about CHIME, visit CHIMEcentral.org.