Few phrases in the healthcare IT world conjure up quite as much excitement as “artificial intelligence.”
Sweeping through vendor marketing teams like a tidal wave of opportunity, nearly every technology company has at least thought about how they can integrate machine learning algorithms into their catalogues in order to take advantage of the fever-pitch hype around this new frontier of big data analytics.
Globally, the healthcare industry is expected to increase its investment in AI by 44 percent between 2017 and 2020, according to a Tata Consultancy Services survey, meaning there are big bucks at stake for AI companies who make early moves into this lucrative new marketplace.
Healthcare-focused artificial intelligence tools are expected to bring in more than $20 billion by the middle of the 2020s, exhibiting a compound annual growth rate (CAGR) pegged at anywhere from 40 to nearly 50 percent over the next five to seven years.
In the first nine months of 2017 alone, global healthcare big data analytics companies scooped up more than $1 billion in venture capital investment, says Mercom Capital, representing a 31 percent increase over funding levels at the same time in 2016.
Genomics data processing, pharmaceutical discovery, imaging analytics, and clinical decision support tools are proliferating at a breakneck pace as start-ups and established tech giants alike try to crack the multi-billion dollar landscape.
“There’s a land rush around AI right now,” acknowledged IBM President and CEO Ginni Rometty at HIMSS17 in February. “The competitive advantage is going to come from being cognitive.”
IBM and its Watson Health division have been aggressively courting healthcare providers in recent years, vacillating between becoming one of the nascent machine learning industry’s leading success stories and one of its most sensationalized cautionary tales.
Over the past several years, IBM has secured a number of high-profile partnerships with respected healthcare stakeholders, including Memorial Sloan Kettering, the Mayo Clinic, Teva Pharmaceuticals, the Cleveland Clinic, Epic Systems, and Quest Diagnostics.
And after a series of hefty investments in acquiring Merge Healthcare and Truven Health Analytics, the company has also published some preliminary research and statistics showcasing Watson products’ abilities to aid clinicians with decision-making around cancer, generate revenue, aid in population health management, and raise care quality.
But MD Anderson Cancer Center’s decision to put its collaboration with IBM on hold, followed by a string of gleefully damning press coverage, has painted IBM’s efforts as a “joke” with hefty hidden price tags, narrow real-life use cases, and few tangible results.
Executives and users quickly hit back with protests that high expectations can’t change the fact that machine learning is still in its infancy, but IBM’s travails have nevertheless become a prime example of the fundamental tension underlying the market: is artificial intelligence promising too much too soon?
Is it worth investing in tools and infrastructure that have barely taken their first steps towards their full potential? And if so, how can prospective enterprise customers balance a healthy skepticism with the real possibility that machine learning will become a truly transformative force in health IT?
Getting pedantic with AI semantics
“Artificial intelligence” may have quickly become the preferred term for health IT vendors, but it is worth mentioning that most data scientists will roll their eyes at the notion that true artificial intelligence is now ready and available to the everyday tech consumer.
While a scant handful of targeted algorithms have started to approach the gold standard threshold of the Turing Test – the ability for a computer’s behavior to be indistinguishable from a human’s – the majority of “AI” tools currently in production don’t even come close.
“If somebody has pitched to you that they’re doing AI, you should probably run the other direction,” says Steve Lefar, CEO of Applied Pathways. “Almost none of these companies that say they’re doing artificial intelligence are really doing artificial intelligence.”
“Artificial intelligence assumes some level of almost consciousness and thinking from the computer,” he explained. “Machine learning, and even deep learning, are still just pattern recognition technologies that are only as good as the datasets that are fed into them.”
The vast majority of machine learning tools intended for clinical applications rely on pre-verified training data, which is used to teach the algorithm what patterns to look for in the next set of information presented to it.
The “learning” happens when the algorithm assesses its success rate, examines the outcomes, and folds that data back into itself to refine its approach for next time.
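In schematic terms, the train-predict-refine loop described above can be sketched in a few lines of Python. The one-feature "classifier" below, along with all of its data and thresholds, is a toy invented purely for illustration; a real clinical model would be vastly more complex.

```python
# Toy sketch of the supervised learning loop: train on pre-verified data,
# predict on new cases, then fold verified outcomes back in and retrain.
# All values here are hypothetical illustrations, not clinical data.

def train(labeled_examples):
    """Learn a decision threshold from pre-verified (value, label) pairs."""
    positives = [v for v, label in labeled_examples if label == 1]
    negatives = [v for v, label in labeled_examples if label == 0]
    # Place the threshold halfway between the two class means.
    return (sum(positives) / len(positives) + sum(negatives) / len(negatives)) / 2

def predict(threshold, value):
    return 1 if value >= threshold else 0

# Step 1: train on pre-verified data.
training_data = [(0.2, 0), (0.3, 0), (0.7, 1), (0.9, 1)]
threshold = train(training_data)

# Step 2: make predictions on new cases, then compare against verified outcomes.
new_cases = [(0.4, 0), (0.8, 1)]
accuracy = sum(predict(threshold, v) == y for v, y in new_cases) / len(new_cases)

# Step 3: fold the newly verified outcomes back into the training set and
# retrain -- this feedback loop is the "learning" in machine learning.
threshold = train(training_data + new_cases)
```

The key point is step 3: the algorithm only improves because verified outcomes keep flowing back into it, which is why the quality of the underlying data matters so much.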
Fully-fledged artificial intelligence would do most of this automatically, and then take the next critical step: using that information to autonomously plan and execute a future action based on the probability that, given the infinite number of variables in an uncontrolled environment, Decision A rather than Decision B would produce the most positive outcome without any avoidable harm.
Pattern recognition skills are certainly critical for creating a fully autonomous entity that can solve a wide variety of novel problems across multiple content domains.
But the ability to reliably recognize patterns in speech, images, text, or raw numbers is just the very first step along the long, complex pathway to full-scale AI.
The difference is important, Lefar insists, from more than just a grammarian’s point of view.
“The risks of overstating your abilities in the healthcare industry are huge,” he said. “The implications are literally life and death. Many of my fellow CEOs are concerned that all the hype hurts everyone’s cause. No one benefits from promising too much to consumers and being unable to measure up.”
Machine learning algorithms, especially those capable of unsupervised learning, can often be black box products that do not give end-users much visibility into how and why a decision is being suggested.
An algorithm may present a certain option to a clinician, but unless the result also includes a detailed, transparent explanation of the data sources and logic used to come to that conclusion, healthcare providers are unlikely to trust it – and are right to hesitate, Lefar added.
“There isn’t a single doctor or nurse that will accept a recommendation without knowing all the rationale behind it – let alone a lawyer,” he said. “The legal ramifications and compliance issues of having a machine make a decision haven’t been worked out yet. We’re not prepared for it.”
Algorithms are essential for synthesizing enormous volumes of information that are simply too complex for an individual human brain to handle, Lefar readily acknowledged.
“But the trick is making sure that the algorithms and the data behind them are transparent and that the programming can provide a clear and complete account of the decision frameworks used to come to that recommendation or conclusion,” he said.
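One way to picture the kind of transparency Lefar is calling for is a system that returns its rationale alongside its recommendation, rather than a bare score. The sketch below is hypothetical; the risk factors, weights, and cutoff are invented for illustration, not drawn from any real decision-support product.

```python
# Hypothetical sketch: a recommendation travels with the factors behind it,
# so a clinician can inspect exactly why it was made. All weights invented.

RISK_FACTORS = {
    "prior_readmission": 3,
    "abnormal_labs": 2,
    "polypharmacy": 1,
}

def recommend_followup(patient_flags):
    """Return (recommendation, rationale) rather than an opaque score."""
    contributions = {f: w for f, w in RISK_FACTORS.items() if patient_flags.get(f)}
    score = sum(contributions.values())
    recommendation = "schedule follow-up" if score >= 3 else "routine care"
    return recommendation, contributions  # the rationale rides along

rec, why = recommend_followup({"prior_readmission": True, "polypharmacy": True})
# `why` lists exactly which factors fired and how much each contributed
```

A black-box model would return only `rec`; surfacing `why` is what lets a doctor, nurse, or lawyer audit the logic behind the suggestion.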
Keeping the focus on clinicians is important, agrees Sameer Badlani, MD, FACP, Chief Health Information Officer and VP at Sutter Health.
But in the day-to-day clinical environment, providers tend to care less about the engine behind their technology and more about how it concretely improves their ability to deliver care.
“What technology we use is probably more of interest to the CIO or informaticists than it is to the end user,” he said. “No physician wakes up and says, ‘You know what? Today I’m going to use artificial intelligence during my day, and it’s going to be a better day because of that.’”
“Yes, the technology is interesting and important. But how we use it to create better quality and a consistently high standard of performance ultimately matters much more than what type of algorithms are driving our insights.”
Focusing on high-value use cases with well-defined parameters can help to ensure that machine learning delivers on its ultimate promise, no matter what it’s called on the box, added Seth Hain, Product Lead for Cognitive Computing and Machine Learning at Epic Systems.
“Under the hood, tech companies may be leveraging any number of innovations like neural nets and other types of machine learning. But it’s really a matter of embedding them into the workflow where we can provide that value, and helping users understand the context of what those algorithms are doing and why.”
Epic is one company that feels it is staying on the right side of the hype as it expands the scale of its machine learning offerings, including a Cognitive Computing Platform announced at the company’s annual user group meeting in September.
The platform will leverage the Microsoft Azure cloud to support in-house development of machine learning components while also allowing the vast Epic user community to create their own tools and integrate them into their electronic health records.
The twin responsibilities of data transparency and user-centered design are also top of mind as Epic continues development, Hain said.
“If you’re a care manager and you’re working through a list of individuals, you might not always care about how that score sorted them up to the top of the list, but you need to be able to get access to why they were flagged within a couple of clicks,” he said. “Being able to show that without slowing down the end user will be important.”
“More and more, we will see this focus on algorithms combined with the presentation layer and how those algorithms are integrated into the user experience. To make the most of that data alongside the other information clinicians are consuming, it’s important to have an intuitive user experience. That is what we intend to target.”
Translating machine learning dreams into real-world results
Completely independent artificial intelligence may be somewhere off in the future, but machine learning tools are becoming a core component of the big data analytics landscape right now.
In addition to the endless stream of pilots, research projects, and lab experiments that are showing positive results with pattern recognition, deep learning, and natural language processing, many provider organizations have already deployed commercial machine learning products into the field.
They might not be real AI just yet, but they can certainly be incredibly sophisticated offerings that produce immediate and impactful results for patients and clinicians.
At the more than 40 hospitals of Missouri-based Mercy, for example, clinicians are using technology from Ayasdi to guide patients down standardized clinical pathways.
Since 2014, the health system has reduced mortality rates by 30 percent or more for patients treated in this manner, and reduced costs by approximately $14 million.
In New Jersey, Hackensack Meridian Health has partnered with IBM Watson for Oncology and Cota to pilot a system that can ingest and process immense volumes of data to support care for cancer patients.
“We need to equip providers with real-world evidence, backed by machine learning, to lower the financial risks of operating under new payment structures,” Dr. Andrew Pecora, Chief Innovations Officer, told HealthITAnalytics.com.
“A physician can now say to herself, ‘Okay, I was going to make one of these three choices about a chemotherapy regimen, or I was going to recommend this surgery first as opposed to doing it later. Now that I have expanded data on this, does the information support that decision, or should I reevaluate?’”
And at Sutter Health, adopting machine learning technology from Qventus is part of the system’s strategy for bringing down pharmacy costs and preventing adverse patient safety events.
“Pharmacy is often one of the biggest cost centers,” Badlani explained. “Part of that is because of the cost of medications, but errors can also contribute to the costs of readmissions and poor outcomes. Our commitment to quality was a main reason why we decided to look into machine learning as a way to reduce inefficiencies.”
By combining EHR data and pharmacy data in near real-time, the tool allows providers to make optimal choices for their patients, he said.
“Often in the hospital setting, IV Tylenol gets used as a pain remedy. Maybe that is appropriate for a first dose, but continuing beyond that is often not the right choice, because you can switch patients to oral medicines that carry fewer of the risks that might lead to a delayed discharge or a readmission.”
A system that can prompt a provider to make the switch – or ensure that a colleague has already done so – could prevent patients from receiving a therapy that is less than ideal for their current situation.
“This is the first time that I truly feel like we are approaching the promise of the EHR by leveraging the large amounts of data that we are collecting in clinical transactions,” said Badlani.
“If we can return value from the efforts of our clinicians to use the EHR to its fullest potential, there will be less angst about technology than there is currently. I believe machine learning is an important part of helping us do that.”
What to look for when partnering with a machine learning vendor
Return on investment, in clinical time or in cold, hard cash, is naturally a key motivator for organizations looking to supplement their infrastructure with machine learning.
But jumping into an agreement with a vendor without a thorough understanding of what they are offering – and how the organization is actually going to use it – would be a costly mistake.
“Don’t try to implement the technology because it’s cool,” Badlani warned. “Too often when you talk to an analytics company, their pitch is along the lines of ‘the physician will see this and magic will happen.’”
“But there is no such thing as magic when it comes to big data. You need to be sure that you are implementing a new product because you have a particular problem to solve. It was only when we made an effort to match a specific tool to a real-world use case that we started to make progress.”
Any vendor that spends more time extolling the virtues of artificial intelligence than collaborating on how to solve a concrete business problem should be avoided, he added.
“When we first started talking to our vendor, the CEO never spent more than two or three sentences describing the technology,” he recalled. “That wasn’t the selling point. His pitch was solving the problem, and I think that often gets missed in the hype about AI and machine learning.”
“When the hype becomes too much, it is often to the detriment of the organization, and also to the technology. The tool may have a lot of value, but if you build that much smoke and mirrors around anything, you’re going to get a higher level of reasonable suspicion from the CHIOs and CIOs that are responsible for purchasing these systems.”
Vendors that take the time to understand the unique needs of the organization will make better long-term partners, he advised.
And a company that pays special attention to the big data quality, integrity, and access challenges facing the majority of healthcare organizations will be an even better bet.
“End-user workflows are so important to keep at the center of the project,” Badlani said. “Physicians can barely get through the data they have that’s already piling up. Why would we want to show them more data if it doesn’t actively contribute to making their job easier or more efficient?”
While vendors are certainly responsible for designing products with intuitive and attractive workflows built to take advantage of a provider’s data assets, organizations have to keep their eyes on the big data prize as well.
“Some of the best advice that comes from one of my colleagues is to think of data like a brick,” he said. “Be careful when deciding which bricks you want to carry in your backpack and how many you can handle.”
“Everyone is excited about hoarding data, but the effort that goes into cleaning up every data brick and cementing it into place in the workflow has to be meaningful to the end user. You should focus on that process more than whether or not the system checks all the buzzword boxes of being artificially intelligent.”
Managing data in the machine learning environment is anticipated to be a difficult – but not insurmountable – task for organizations and their vendor partners, predicted Hain from Epic.
“As there’s a proliferation of these intelligence products, it’s going to be important to effectively manage them,” he said.
“That space of managing these models has not evolved very much yet. It’ll become increasingly important to pay close attention to the care and feeding of those machine learning pieces by understanding how they change and what predictive value they really provide.”
Algorithms may not perform exactly the same way in the wild as they did during a demo, he cautioned. Variable levels of data integrity or stubborn data siloes that complicate the integration of certain elements could slightly alter the way an algorithm functions, which is important to monitor.
“You want to pay close attention to whether or not the predictions being presented are reasonably aligned with expectations,” said Hain.
“And you need to consider the actual outcomes. What happened to that patient? What action did the provider take? Was it what you expected? Being able to survey and adjust those aspects as required is going to be the key.”
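The post-deployment surveillance Hain describes can be reduced to a simple comparison: line up the model's live predictions against verified outcomes and flag when performance drops below demo-time levels. The sketch below is a hypothetical illustration; the baseline accuracy, tolerance, and data are all invented.

```python
# Hypothetical drift check: compare live predictions against verified
# outcomes and flag when accuracy falls below the level seen in the demo.
# Baseline, tolerance, and the sample data are all invented values.

DEMO_BASELINE_ACCURACY = 0.90
TOLERANCE = 0.05  # how far below baseline before we raise a flag

def check_for_drift(predictions, outcomes):
    """Return (live_accuracy, drifted) for paired prediction/outcome lists."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    live_accuracy = correct / len(outcomes)
    drifted = live_accuracy < DEMO_BASELINE_ACCURACY - TOLERANCE
    return live_accuracy, drifted

# In the wild, data-integrity gaps can quietly erode performance.
acc, drifted = check_for_drift([1, 1, 0, 1, 0, 1, 0, 1, 1, 0],
                               [1, 0, 0, 1, 0, 1, 1, 1, 1, 0])
```

In practice this check would run continuously against each deployed model, which is exactly the "care and feeding" burden Hain warns organizations to plan for.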
Meaningful variations in data sets may not be as obvious as a blank field or missing column, Lefar pointed out.
“There is inherent bias in all datasets, and much of it is hidden,” he said. “Sample selection is almost always biased in some way or another, and sometimes the parameters are unconsciously designed to promote a certain outcome or match some preconceived notion.”
“We don’t want computers to inherit that from our natural human way of thinking. That would defeat the purpose.”
Perfection in datasets is nearly impossible to find, but organizations can invest in comprehensive data governance efforts that go some way towards making data as clean, complete, accurate, and trusted as possible before feeding their growing collection of machine learning tools.
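One small governance check implied by Lefar's warning is to compare the makeup of a training sample against the population it is supposed to represent, surfacing hidden sample-selection bias before a model ever trains on it. The groups and figures below are hypothetical.

```python
# Hypothetical governance check: does the training sample look like the
# population the model will serve? Group names and figures are invented.

def representation_gap(sample_counts, population_shares):
    """Return each group's share in the sample minus its population share."""
    total = sum(sample_counts.values())
    return {g: sample_counts[g] / total - population_shares[g]
            for g in population_shares}

gaps = representation_gap(
    sample_counts={"over_65": 70, "under_65": 30},       # who is in the data
    population_shares={"over_65": 0.4, "under_65": 0.6}, # who should be
)
# A large positive or negative gap signals a skewed training sample that
# could quietly bias whatever model is trained on it.
```

A check this simple catches only the most visible form of bias; subtler distortions in how parameters were chosen, as Lefar notes, are much harder to audit.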
A robust governance program and a focus on data integrity is just one part of the positive culture change Badlani anticipates from his organization’s early investment in machine learning.
“Always look at these kinds of projects to change the culture of your organization,” he suggested. “It is a long and complex process to change that culture, especially in a large organization like Sutter Health.”
“Financial ROI is important, certainly. But the real success is that, five years from now, we will be able to have 20 or 30 of these types of projects going on because we started to proactively develop the strategies and best practices for making them work. That is how you start something meaningful for the long-term. That is how you create something good.”