In most cases, and across most industries, data standards get little respect. Their endlessly, hopelessly dense documentation detailing the minutiae of splicing together millions of lines of complex code doesn’t seem to capture the public imagination the way a slick new front-end interface or a flashy, chrome-covered device does.
Most technology users care more about how well their tools perform than about what drives them under the hood. But as with so many other technology topics, healthcare is an industry that stands apart from the crowd.
The Fast Healthcare Interoperability Resources standard, commonly known as FHIR, has done a rare thing in the world of health data: it has quickly become much more than just a dull technical specification. It has rocketed past the yawn-inducing reputation of its peers to attain superhero status in the ongoing battle against proprietary data formats, questionable information blocking practices, and other seemingly unbreakable barriers to health information exchange.
FHIR’s main enemy may be more related to unfortunate circumstances than sinister shenanigans by villainous vendors, and the standard, created and managed by HL7 International, may not end up saving the whole world.
But it does hold a great deal of promise for helping healthcare organizations move forward with the complicated task of making sophisticated and seamless population health management feasible for their technology-weary clinicians.
At Geisinger Health System, which boasts a long history of big data analytics and population health management innovation, a partnership with Cerner Corporation has brought an application-based approach to health IT development.
In addition to adopting Cerner’s HealtheIntent population health management tool as a core component of its health IT infrastructure, Geisinger is embarking on an ambitious journey towards crafting an interoperable, patient-centered, intuitive population health management ecosystem, using FHIR as one of its guiding sparks.
“We’ve had a lot of data here for a long time, and we continue to get more and more every day,” said Dr. Nicholas Marko, Chief Data Officer at Geisinger Health. “We’ve always been interested in figuring out which are the best tools on the market to help us take care of patients the absolutely best way we can. We continually assess what we have in our toolbox and what we want to add or what we need to change.”
“One of the areas we wanted to focus on is how we could get a more comprehensive view of our patients longitudinally, from a population health management perspective,” he continued.
The challenges of living in a multi-vendor world
Geisinger has worked hard to standardize its technology tools across its many care sites, but as with most healthcare systems, especially those that have added new partners or acquisitions in recent years, they are operating in a multi-vendor environment that presents certain big data challenges.
“We have a long history of using Epic, but we also have Cerner products and some other products, so we’ve got a lot of different information in many different formats,” Marko explained.
“There’s no such thing as one set of data that gives you everything you need in one single format. There will always be information coming from a number of different places, and there will always be a need to work with systems that handle that.”
Healthcare providers looking to integrate disparate data sets for population health management or other analytics use have many more options these days than they did only a few short years ago. As vendors bow to industry pressure to drop their resistance to sharing data across party lines, interoperability is turning into a selling point rather than a last resort.
“As an industry, we have to come together to solve the problem of access to our own healthcare information,” said Cerner Corporation President Zane Burke. “Patients deserve access to their data no matter where they are in the country, and no matter where their record primarily resides. They should have the ability to provide consent to have a clinician be able to pull those records whether they’re on a Cerner system or a competitor’s solution. Ultimately, that’s what we need to deliver.”
“We really need to drop interoperability as a competitive differentiator in this industry. Once everyone comes to the table and recognizes that we have a moral obligation to provide patients with their health records, I think we’re going to be much better off.”
“The demand for interoperability has skyrocketed,” agreed David McCallie, Jr., MD, Senior Vice President of Medical Informatics at Cerner, and the Director of the Cerner Medical Informatics Institute.
“Interoperability used to be the last thing you worried about. You just wanted to make sure that your own data worked well. Moving data around or interacting with external systems was something you only thought about if you had to, but otherwise you didn’t prioritize it.”
“That’s completely changed with the shift towards pay-for-performance rather than fee-for-service,” he added. “Now, all of a sudden, interoperability is imperative.”
Value-based reimbursement contracts and accountable care organizations often require providers to shoulder responsibility for their patients no matter where those consumers are seeking care, making health information exchange and nimble, far-reaching population health management tools a must for newly vulnerable organizations.
Providers have a long history of pleading with their EHR vendors for software tools that are up to scratch for attestation to the EHR Incentive Programs and other quality-based initiatives, and it is a matter of opinion whether or not those cries have been sufficiently heard.
At Geisinger, turning the customer-vendor relationship into a proactive partnership with Cerner has led to important innovations that go far beyond a basic one-and-done EHR implementation that leaves users scratching their heads when something doesn’t work the way they want it to.
“We really like that Cerner has been eager to work with us as collaborators and development partners, which goes beyond the traditional commercial relationship,” Marko said. “It’s more of a two-way street, where we can each take the pieces that we’re good at and help develop even better systems that work better for patients.”
“One of the interesting things about electronic health records, population health management, and the science behind data management in healthcare is that tools become more complicated as information becomes more readily available, because as we’re dealing with more and more complicated patients, we get larger and larger chunks of information that we have to visualize,” he continued.
“There is an inherent ceiling on how much innovation and novel development work you can do within the context of any EHR, because of course these are designed to be operational, transactional systems.”
Moving beyond the EHR and into an application ecosystem
While EHR vendors have generally taken great pains to create products that do much more than record clinical notes or catalogue data for billing purposes, “you’re necessarily a bit constrained because you have to abide by certain rules,” Marko argues. “You don’t necessarily have all the options that you would outside the EHR, in a world where you can use any development tools you want.”
That’s where an application-based approach can help health IT tools shine, Marko asserted. “It makes a lot of sense for us to start looking outside the EHR to develop those broader toolsets that we need to manage patients,” he said.
“However, we want to be sure that we are not making our health IT systems too cumbersome and difficult – we don’t want people having to go to multiple places at once just to get information about the same patient. What we want to do is have an extrinsic application link seamlessly with the EHR, so that interaction with the application and the process of calling up data can happen easily within the workflow.”
Proof-of-concept for the potential of the app environment came in the form of a tool called RheumPACER, which was developed with the goal of allowing Geisinger rheumatologists to access and organize patient data in a way that made the most sense for managing patients within their particular specialty.
The success of the software, created with the help of the clinicians themselves, gave Geisinger some good ideas for taking the notion a few steps further. As the health system was finalizing its partnership, Cerner was announcing the launch of a pilot program that brings FHIR standards into its Millennium EHR product.
“When we entered into this partnership with Cerner, and as their FHIR standards came online, Geisinger said, ‘Well, can we develop this piece of software into an application extension that can pass information back and forth seamlessly in the background, so that a similar set of tools becomes more readily accessible for other areas of interest?’” Marko recalled.
Initially, “the rheumatology app didn’t have anything to do with population health management per se,” McCallie noted. “But the concept that you can have an outside entity create a standards-compliant application that then is plugged back into the physician’s workflow is perfectly applicable to population health.”
“In fact, it’s necessary, because that’s really the only way to do it. Unless you want to make the clinician log in to two separate systems and try to keep track of two things going on in separate places, we think the app approach is a long-range target for how we’re going to approach population health management.”
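One widely used pattern for plugging an outside, standards-compliant application back into the physician’s workflow – not named in this article, but common in FHIR-based development – is the SMART on FHIR launch: the EHR opens the app with a launch token, and the app requests authorization before calling the EHR’s FHIR API. The sketch below builds such an authorization request; every endpoint, client ID, and token here is a hypothetical placeholder.

```python
from urllib.parse import urlencode

# Hedged sketch of a SMART on FHIR-style EHR launch (an assumption, not
# the specific mechanism described in the article). The EHR hands the app
# an opaque launch token; the app then asks the authorization server for
# permission to read patient data in that launch context.
def smart_authorize_url(authorize_endpoint, client_id, redirect_uri,
                        launch_token, fhir_base, state):
    """Build the OAuth2 authorization request URL for an EHR-launched app."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "launch patient/*.read",  # read access in launch context
        "launch": launch_token,            # opaque token passed by the EHR
        "aud": fhir_base,                  # the FHIR server the app will call
        "state": state,
    }
    return f"{authorize_endpoint}?{urlencode(params)}"

# All values below are hypothetical placeholders.
url = smart_authorize_url(
    "https://ehr.example.com/oauth2/authorize",
    client_id="pop-health-app",
    redirect_uri="https://app.example.com/callback",
    launch_token="xyz123",
    fhir_base="https://ehr.example.com/fhir",
    state="abc",
)
print(url)
```

Once the handshake completes, the app can call the FHIR API on the clinician’s behalf without a second login, which is exactly the single-workflow experience the quote above describes.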
Creating a seamless experience for the end-user is a critical step for making population health management programs truly actionable, added Marko. “We can take things that we design in just about any platform, and we can link them with traditional data management tools and databases to make that experience seamless for the end user who is consuming that information,” he said.
“That’s very important, because it gets rid of a lot of data siloes. It gets rid of a lot of the constraints about what you can and can’t develop. It also saves a lot of time and money, because you don’t have to engineer so many workarounds. That can take a lot of developer hours, and those hours add up to be very expensive in this space.”
Many of the problems with EHR interoperability have to do with a reliance on documents as the foundation for exchanging clinical information, Micky Tripathi, CEO of the Massachusetts eHealth Collaborative and Chair of the eHI Interoperability Workgroup, recently said to HealthITAnalytics.com.
“We want to move away from where we are now, which is document-based exchange. Right now, interoperability in healthcare is basically just the exchange of Consolidated Clinical Document Architecture (C-CDAs),” Tripathi said.
“The exchange of these XML documents has a certain value, because unlike in banking, where I just need raw data, whole documents are really important in clinical care,” he continued. “For banking, you can just tell how many dollars there are, or tell me how many units there are. Give me an account number, and I’m good to go. I don’t need a document. I don’t need as much context around the data in that transaction.”
However, population health management is all about context. Providers must understand a patient in a holistic manner if they are to properly address the challenges – clinical, social, and behavioral – that lead to poor outcomes, the development of chronic diseases, and the overuse of expensive services like emergency departments.
“If you just send me some lab results or a list of allergies, that’s great,” said Tripathi. “I need those things, but you haven’t told me the story of the patient, and that’s really important for a clinician to understand. Document exchange is important, but so is that data-level exchange. Health information exchange based entirely on C-CDA XML documents doesn’t allow you to access information at a data level as well.”
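The difference Tripathi describes is easy to see in code. A FHIR Observation is a small, discrete JSON resource, so an application can reach directly into individual data elements instead of parsing an entire C-CDA XML document. A minimal sketch, using a sample hemoglobin result shaped per the FHIR Observation specification:

```python
import json

# A single FHIR Observation resource (sample data): one discrete lab
# result, rather than a whole clinical document.
observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {
    "coding": [
      {"system": "http://loinc.org", "code": "718-7", "display": "Hemoglobin"}
    ]
  },
  "subject": {"reference": "Patient/example"},
  "valueQuantity": {"value": 13.2, "unit": "g/dL"}
}
"""

obs = json.loads(observation_json)

# Data-level access: pull out exactly the fields a population health
# dashboard needs, with no document parsing in between.
loinc_code = obs["code"]["coding"][0]["code"]
value = obs["valueQuantity"]["value"]
unit = obs["valueQuantity"]["unit"]

print(f"{loinc_code}: {value} {unit}")  # 718-7: 13.2 g/dL
```

Document exchange still matters for narrative context, as Tripathi notes, but this granular, field-level access is what C-CDA-only exchange cannot offer.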
Transcending the document-based exchange mentality and opening up a broader playing field for big data analytics will be “the next stage of interoperability,” McCallie predicted.
“Current interoperability efforts are really focused on moving data back and forth and overcoming data reconciliation challenges. But in the future, we’re going to have to coordinate care plans across boundaries of systems. And to do that, you need apps. You need something that you can actually interact with and click on and do something with. This is a long-range vision, and it may take some years for it all to play out.”
“The FHIR standard is still quite new,” he pointed out. “It’s not even a formal standard yet – it’s still in draft status. And vendors who are implementing it are feeling their way forward to make sure they understand it, and to discover if there are any gaps or bugs, or if the specification is not actually specific enough.”
Is FHIR the magic bullet for health data interoperability?
“Boy, that’s a multi-million dollar question – I don’t know how many millions, but there are a lot of zeroes behind it,” laughed McCallie when HealthITAnalytics.com posed the query.
“We’re in the middle of a lot of hype around FHIR. There will be people who attribute magic powers to it. They will be disappointed, because it’s not magic. It’s just technology, but it’s pretty solid technology.”
It may not be the answer to everything, but FHIR has some important positive attributes in its corner that may help it see success in the healthcare environment, he said.
“One is that it’s fundamentally well-designed. It’s based on the same technologies that power the internet. We’ve had several decades of experience, now, in figuring out how to make the internet scale, and many of those good ideas are encapsulated in the core design of FHIR.”
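Those “technologies that power the internet” are plain HTTP, RESTful URLs, and JSON, which means any general-purpose web tooling can talk to a FHIR server. A hedged sketch of the standard FHIR search pattern, `GET [base]/[type]?[parameters]`, using a hypothetical base URL:

```python
from urllib.parse import urlencode

# Hypothetical FHIR endpoint for an EHR (placeholder, not a real server).
FHIR_BASE = "https://ehr.example.com/fhir"

def fhir_search_url(resource_type: str, **params: str) -> str:
    """Build a RESTful FHIR search URL: GET [base]/[type]?[parameters]."""
    return f"{FHIR_BASE}/{resource_type}?{urlencode(params)}"

# All hemoglobin results (LOINC 718-7) for one patient, as an ordinary
# web request any HTTP client library could issue.
url = fhir_search_url("Observation",
                      patient="12345",
                      code="http://loinc.org|718-7")
print(url)
```

Because the request is just a URL and the response is just JSON over HTTP, the decades of scaling lessons McCallie mentions – caching, load balancing, stateless servers – apply to FHIR traffic with no healthcare-specific machinery.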
“That’s a much better starting point than some of the previous standards used in healthcare, which were kind of customized to the industry and used technologies that anyone outside of healthcare wouldn’t understand,” he continued.
“That limited the number of developers working on these projects, and limited the number of tools that were available. It created confusion and complexity, because everything was so healthcare-specific. FHIR gives us a much better starting point, and that might be the best thing it has going for it. It’s a matter of this particular standard being in the right place at the right time, to some extent.”
Gauging FHIR’s future as the torch-bearer for data liquidity
“The experts at HL7 International are doing their best to take a measured, deliberate approach to the problems in front of them,” McCallie said.
“The FHIR team is sticking with the 80-20 rule,” he said. “They want to make sure that the 80 percent of most common use cases work really, really well, and then they’ll take on the 20 percent of less common problems. That’s a great way to approach this.”
“They’re working very hard to keep it from going off the rails or getting too complicated. It’s a struggle, because there’s always this desire to solve all the problems at the same time. But if you set your sights too high, you might make things so complicated that no one can use your tools to solve simple problems.”
With so many simple problems still to tackle, and the financial pressures of value-based reimbursement quickly piling on top of providers, solving health data interoperability has never been a more immediate concern.
Some attempts to work around the major roadblocks on the health IT highway have been more successful than others, but few have attracted as much excitement and confidence as efforts based on FHIR.
“There’s a small cadre of people empowered by HL7 to manage FHIR development, and they’re very good,” he said.
Cerner isn’t the only organization pinning its hopes on FHIR as the way forward. The ONC has promoted FHIR and application programming interfaces (APIs) as a keystone of its efforts to expand access to common clinical data sets for better big data analytics.
Epic Systems, often viewed as Cerner’s arch-rival in the EHR space, is also holding a candle for the data standard, using a FHIR-based API to allow its users access to clinical decision support and analytics insights from IBM’s Watson supercomputer.
At the FHIR Connectathon 10, held this past autumn in advance of HL7’s annual meeting, more than a dozen stakeholders registered and tested their FHIR implementations with AEGIS, a health information exchange standards testing firm. The participants included McKesson and the Mayo Clinic.
FHIR also features prominently in many private industry efforts to improve system-wide health data interoperability, including the Healthcare Services Platform Consortium (HSPC), which includes partners such as Intermountain Healthcare, the Regenstrief Institute, and the American Medical Association.
The standard may be among the most promising proposed in recent memory, but Dr. Marko cautions against seeing the framework as something completely set in stone.
“It’s always difficult, no matter what industry you’re in, to say if one particular standard is going to be the thing that goes forward in exactly its present format,” he said. “In the future, we might see some slight modifications, or we may see some major overhauls. We don’t really know.”
“I have to imagine that it’s going to be FHIR in its current form – or something that looks pretty close to it – that is going to be the persistent standard for all of this,” he acknowledged. “And it makes a lot of sense, because we are at a time when technology is advancing faster than any single platform can keep up with.”
FHIR may see some changes and adaptations along the way, or it may even fade in prominence as some as-yet-unthought-of technology offers a different solution.
“The bigger result of all this is that the niche is open now,” Marko stressed. “The demand is there – particularly the demand for getting information out of traditional EHRs and into more flexible environments where you can do a lot more with your data without having to compromise workflow.”
Improving interoperability and the industry’s reliance on workable data standards to the point where population health management can truly flourish may be “tedious work,” as McCallie called it, and it may take a very long time to get it right.
“But it’s very encouraging to me to see that there’s quite a bit of a vendor community behind it,” he said, “and I think that bodes well for the future of the standard, not to mention the industry as a whole.”
This article was originally published on January 12, 2016.