As the healthcare industry looks just over the horizon to a busy, eventful, and competitive 2019, artificial intelligence tools are likely to take on an increasingly important role across the care continuum.
Payers, providers, and health IT developers are pinning many of their hopes for lower costs and better outcomes on AI’s ability to recognize subtle patterns in huge volumes of data, streamline workflows, or suggest actions likely to produce better future results.
For the Office of the National Coordinator (ONC), which is responsible for encouraging health IT adoption and fostering meaningful interoperability, artificial intelligence is both promising and potentially problematic.
Machine learning certainly offers unparalleled possibilities to turn the industry’s big data assets into actionable insights, says National Coordinator Don Rucker, MD, and the ONC is confident that AI will play a major part in improving the delivery of care.
But stakeholders may find it challenging to rapidly develop the technical foundations required to support AI, particularly the open application programming interfaces (APIs) that will allow organizations to feed data-hungry algorithms in a secure and standardized manner.
“There’s obviously extraordinary excitement around AI at the moment, just as there was during the first wave of machine learning a few decades ago,” Rucker told HealthITAnalytics.com at the ONC 2018 Annual Meeting in Washington, DC.
Rucker, an emergency physician, computer scientist, and seasoned healthcare executive, has had an interest in artificial intelligence since graduate school. He has seen waves of enthusiasm for AI come and go – but this time, he believes that AI is likely to stick around.
“The hype is very, very similar to what it has been in the past,” he observed. “You could probably take any 20 articles about AI from the early 1980s, change the dates on the bylines, maybe switch around a few buzzwords, and those articles would be nearly identical to what’s being published today.”
“The difference, however, is in the computing power we have now. There’s a reality to AI in 2018 that wasn’t there before. The world is moving towards dedicated chips that have far more horsepower than they used to. It’s much more clear these days how AI is going to solve real problems.”
Making the leap from theory to reality will require more than just the zeal of AI aficionados. It will demand a concerted effort from all stakeholders to build the big data pipelines that allow organizations to share critical datasets, develop new algorithms, and validate machine learning models.
Coordinating that effort is a natural extension of the ONC’s original mission to digitize the country's health data, Rucker said.
“Historically, ONC has been heavily involved in helping to make health data electronic,” he said. “Without digital data, none of this happens. We’ve largely achieved that goal, so now we’re looking towards the future.”
“We use the term ‘big data’ now, because we have lots of different systems generating lots of different types of information. But it’s mostly locked away in silos and it’s not actually easy to get at.”
Fragmented or inaccessible data simply won’t do for AI developers looking to create the products and services of the future, he continued.
“When you’re talking about using data for generating insights, or using data to train machine learning models, you’re going to be in search of data that can help you identify patterns – differences between patient groups, or treatments, or protocols. To do that, you need more than just the data from one entity. You need to be looking across enterprises.”
That is where open APIs become essential – and that’s where the ONC comes in.
“Our role under the 21st Century Cures Act is to support the connections between disparate entities and their data while being mindful, of course, of privacy and security,” Rucker said. “Application programming interfaces are a key part of that.”
APIs act as bridges between applications or systems that may not otherwise share a common language.
APIs allow applications to request access to data held elsewhere, making it easier for engineers and data scientists to create tools that harness existing data assets or services without redeveloping everything from scratch.
In the healthcare environment, APIs have quickly become crucial for exchanging data, supporting analytics tools, and enabling access to personal health information, especially when combined with the HL7 Fast Healthcare Interoperability Resources, better known as FHIR.
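To make the API-and-FHIR pairing concrete, the sketch below shows roughly what an individual-level FHIR interaction looks like: a client builds a RESTful read URL for a single Patient resource and pulls a display name out of the JSON the server would return. The base URL is a placeholder, the sample resource is a minimal illustration modeled on the FHIR Patient examples, and no real server is contacted — this is a sketch of the request shape, not a reference client.

```python
import json

# Hypothetical FHIR server base URL -- illustration only
BASE_URL = "https://fhir.example.org"

def patient_query_url(base, patient_id):
    """Build a FHIR RESTful 'read' URL for a single Patient resource."""
    return f"{base}/Patient/{patient_id}"

# A minimal FHIR Patient resource, as a server might return it
patient_json = json.loads("""
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}
""")

def display_name(patient):
    """Join the given and family names from the first name entry."""
    name = patient["name"][0]
    return " ".join(name.get("given", [])) + " " + name["family"]

print(patient_query_url(BASE_URL, "example"))
print(display_name(patient_json))
```

Because every FHIR server exposes the same resource shapes, a tool written against one endpoint can, in principle, consume data from another — which is exactly the cross-enterprise reuse Rucker describes.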
The utility of APIs will need to be extended even further in order for artificial intelligence initiatives to succeed, Rucker asserted.
“Without open APIs, machine learning simply isn’t going to happen,” he stated.
“That’s because AI is like burning gasoline in a very inefficient engine. You need a lot of data to come up with even a little bit of a conclusion. APIs are going to be critical for allowing developers to access enough data to train and test meaningful models.”
However, that doesn’t mean that APIs should create a sudden data free-for-all, he added.
“I want to be clear that APIs aren’t just an open door to do whatever you want with data,” Rucker stressed. “Anyone exchanging protected health data through an API is still bound by HIPAA, and they still have to seek appropriate consents from individuals to use their data for analytics or research.”
Privacy and security are not the only challenges facing artificial intelligence developers who want to take advantage of APIs.
Currently, healthcare-specific APIs only allow applications to draw on data at the individual level, Rucker explained.
“The interoperability and patient data access conversations we’ve been having on the federal level have been largely centered around allowing an individual to access his or her personal health information. The APIs we have right now can support that, which is very, very important for the consumer-centered healthcare that we all want to see.”
“But there isn’t currently an API that will act as a firehose for data at a population level,” he said. “That’s really the missing link for getting the large volumes of anonymized, normalized data we need for AI. Without that capability, AI will be nothing in healthcare.”
Fortunately, stakeholders are already hard at work on a new generation of APIs that are designed for sharing data at scale.
Among the most prominent is the Argonaut Project, a private sector initiative to expand the capabilities and increase the adoption of HL7 FHIR.
At some point during 2019, the group expects to be able to offer the industry the ability to transfer larger volumes of health data through the FHIR protocols.
“These population-level queries are designed to use generally the same data structures and approaches as the individual queries we can complete, but scaled up to industrial strength,” said Rucker. “That's a critical building block for AI.”
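The population-level queries Rucker describes map onto what the FHIR community has specified as Bulk Data export (sometimes called "Flat FHIR"), which the Argonaut participants helped advance. A minimal sketch of the kick-off request is below; the group identifier and resource types are hypothetical, and the function only constructs the URL and headers rather than calling a live server. The pattern — an asynchronous `$export` operation requested with `Prefer: respond-async` — scales the same resource definitions used for individual queries up to whole populations.

```python
def bulk_export_request(base_url, group_id=None, resource_types=None):
    """Construct the URL and headers for a FHIR Bulk Data export kick-off.

    With a group_id, the export covers that Group's members;
    otherwise it covers all patients on the server.
    """
    path = f"Group/{group_id}/$export" if group_id else "Patient/$export"
    url = f"{base_url}/{path}"
    if resource_types:
        # Limit the export to specific resource types
        url += "?_type=" + ",".join(resource_types)
    headers = {
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",  # asks the server to process the export asynchronously
    }
    return url, headers

# Hypothetical cohort and resource selection -- illustration only
url, headers = bulk_export_request(
    "https://fhir.example.org",
    group_id="diabetes-cohort",
    resource_types=["Patient", "Observation"],
)
print(url)
```

In the real workflow, the server would respond with a status URL that the client polls until the export completes, then download the results as bulk files — the "firehose" capability the individual-level APIs lack.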
Once the capability is sufficiently developed and turned loose in the healthcare environment, Rucker believes that uptake will be swift and decisive, ushering in a new generation of AI opportunities.
“I believe that within the next two years, we’re going to be using that ability broadly throughout the healthcare industry, and it is going to be revolutionary in many ways,” he predicted.
“We have to be able to do this quickly. If we can't find a way to use APIs to their full capacity, then AI is simply not going to be operationalized in healthcare.”