

EHR Users Want Their Time Back, and Artificial Intelligence Can Help

Artificial intelligence was everywhere at HIMSS18, but EHR vendors are more focused on the mundane than the moonshots.


Healthcare is a world full of adages, aphorisms, and narratives.  From “first, do no harm” and the “triple aim” to “keeping the patient in the center of care” while snatching that “low-hanging fruit,” providers and regulators seem to love summing up their goals, guidance, and priorities into pithy phrases that can help them communicate their visions.

One of those phrases is somewhat less rousing, yet it remains one of the most pertinent and frustrating themes to emerge in recent years: “resistance to change.” 

Healthcare is notorious for it – curiously so for a profession rooted in discovery, experimentation, and observation. 

During the earlier part of this fleeting decade, resistance to change reared up in opposition to the electronic health record, which promised to transform the day-to-day workings of every component of the healthcare ecosystem.

EHRs certainly delivered on that promise, but not in the way that most providers expected or desired. 

Early systems were clunky, cumbersome, and prone to sapping hours from a clinician’s day through poor design, too many alerts, and overwhelming documentation requirements.

Many would argue that newer iterations of the EHR haven’t changed significantly enough to restore the joy of medicine to overworked providers, who now have much more data, but barely more insight, compounding the everyday challenges of patient care.

Dissatisfaction is still at alarmingly high levels, and “physician burnout” is now just as common to hear as any of healthcare’s other favorite phrases. 

Providers often spend more time documenting than they do seeing their patients, and many stay long after the clock strikes quitting time to finish up their daily administrative tasks.

The current situation is simply untenable, and more stakeholders are starting to accept that something has to give if healthcare professionals are to have any hope of meeting their responsibilities to patients while remaining financially solvent and continuing to improve the quality of their care.

Clinicians will immediately protest that they have been demanding modifications for years now, but their vendors have not responded quickly or comprehensively enough. 

Vendors will likely fire back by pointing out that providers are suspicious of new tools and workflows by default, and that creating organizational momentum for innovative strategies is often a difficult task.

For providers and health IT developers locked into the same old patterns of friction and fractiousness, it may seem like the intolerable status quo won’t be changing any time soon. 

Not even value-based reimbursements are significantly shaking up the technology equation at the moment. 

Instead, the pressures of population health management and care coordination are generally just adding new tasks on top of the same health IT foundations, which require more or less manual input of information and significant cognitive effort to extract meaning from available data.

But as healthcare approaches its financial tipping point and patient-provider relationships become subject to the same consumerization as in other industries, little cracks in the industry’s opposition to change are letting a tantalizing new technology start to peek through.

“I want you to imagine something,” said Eric Schmidt, former Executive Chairman of Google parent company Alphabet, Inc., to a packed auditorium of clinicians and technologists at HIMSS18 in Las Vegas. 

“I want you to imagine a mic and a speaker in a room with a patient and a clinician.”  

“This system listens to the conversation, disambiguates the voices, follows the consultation, and gives suggestions to the clinician in his or her earpiece.  It transcribes the situation so everyone has a record of the complete conversation, and then it fills out and navigates the EHR.”

Schmidt calls this virtual clinician “Dr. Liz” in honor of Elizabeth Blackwell, the first female physician to graduate from an American medical school, and he believes that clinicians are no more than a few years away from being able to put their laptops aside and focus fully on their patients again.


A number of the technologies that would support Dr. Liz’s passive data collection and analytics skills are already available, Schmidt said during his opening keynote, and the rest are on their way. 

Many were on display, in one form or another, in the vendor halls at HIMSS18.

Underpinning this promise of completely redefining the relationship between providers, their technologies, and the ticking clock are two words that will replace many of healthcare’s current catchphrases over the next few years: artificial intelligence.

“Give them back their time to care”

Every new technology has its enthusiastic evangelists, most of whom take its potential to the extreme.  Artificial intelligence is no exception. 

Since long before the beginning of the computing age, fiction writers and futurists have been predicting that self-aware machines will offer a cheap, tireless, and error-free replacement for even the most complex of human skills.

That event horizon is actually getting closer as innovation speeds up exponentially, warned a 2017 survey of AI experts from Oxford University and Yale. 

In as little as 125 years, all human jobs are likely to be automated.  Surgeons will only have until 2053 to update their resumes.  After that, robots will be taking over the OR.

Eric Schmidt opening HIMSS18 with a keynote on artificial intelligence

Source: HIMSS

Radiologists and pathologists might have even less time than that.  Machine learning has quickly started to excel at classifying images down to the individual pixel, offering more precise and accurate identification of cancers and other conditions than human readers can achieve.

“We’re hearing a lot in the popular press about the power of this technology, and there are articles coming out from some very smart people that say artificial intelligence will replace certain job functions or categories,” acknowledges Mark Michalski, MD, a radiologist himself and Executive Director of the MGH & BWH Center for Clinical Data Science.

“It can be a little bit intimidating and even a little frightening.  And that’s not just for doctors.  It’s for a lot of different professionals and nonprofessionals alike.  That comes from not yet fully understanding the boundaries of this technology and what it is really capable of doing.”

Michalski spends his time working to answer those very questions.  And so far, he is not convinced that his colleagues in the radiology department will be getting their pink slips any time soon.

“AI is very, very good at performing some tasks and not very good at others.  My work is trying to understand how, why, and where artificial intelligence fails and how to push the boundaries of that,” he said. 

“From that perspective, I believe it’s reasonable to say that parts of what a radiologist does will become assisted by artificial intelligence.  But I can’t see them being replaced by AI any time soon.”

Dr. Joseph Kvedar, Vice President of Connected Health at Partners, agrees that autonomous robots are not likely to be roaming the halls of healthcare organizations in the foreseeable future.

“I won’t say never, but it’s highly unlikely that we’ll be getting rid of people,” he said.  “I’m not sure that’s even our goal with AI.  Artificial intelligence should be about getting the decision-making tools in front of people with brains.”

“AI should do the things that it can do better than humans, but only in service of freeing up humans to use their good judgment for the things computers can’t do well.  It’s about giving clinicians back their time to care.”

Providers only have a limited number of minutes with every patient, Kvedar said, and using that time on tasks that could be automated is a waste of a precious opportunity to make a connection between human beings.


“We are trying to change the paradigm we’ve created, where the first priority is the EHR and the second priority is the patient,” he said. 

“That is what frustrates clinicians, and it frustrates patients as well.  If a provider can come to work and have all the data gathering and analysis done for them, can you imagine how that would change what we’ve turned healthcare into?”

Schmidt’s ambient AI scribe, Dr. Liz, may be one way to reduce the time clinicians spend on data collection and documentation. 

But the electronic health record isn’t going to disappear any time soon.  When it comes to diagnosing new concerns, stratifying risk, and recommending treatments, providers will always need something more than an ambient assistant.

That is where EHR vendors are finding their opportunity to leverage the growing interest in – and sophistication of – machine learning and AI.

Dr. Betty Rabinowitz, SVP of Solutions at NextGen Healthcare

Source: Xtelligent Media

“In the midst of this Tower of Babel that we have created, we’re still at the point where it’s very exciting if something actually works well,” observed Betty Rabinowitz, MD, FACP, Senior Vice President of Solutions at NextGen Healthcare.

“We have so much data available to us now, but too many vendors are exposing data for the sake of showing that they have the data to expose.  It’s very similar to what I used to ask my residents about ordering tests: could having this new information change your decision-making process?  If not, why order it?” 

“It’s the same with putting data into the EHR workflow,” she said.  “If it isn’t going to help in a concrete way for the patient at hand, then don’t ask the clinician to think about it in addition to everything else she has to synthesize.”

Using artificial intelligence to take out the trash

The temptation to cram all of the latest and greatest in deep neural nets, random decision forests, and Bayesian networks into the EHR is a strong one, especially as the amount of available data in the healthcare industry expands, data storage gets cheaper, and processing gets ever more powerful.

“We finally have enough data to start asking interesting questions around population health, comparative effectiveness, and personalizing care,” said James Golden, PhD, Senior Managing Director for PwC’s Healthcare Advisory Group. 

“And we finally have enough affordable computing power to get the answers we’re looking for.”

That wasn’t the case only a few decades ago.  Supercomputers were required to explore the possibilities of machine learning, and the idea of integrating AI into a fast-paced task like patient care was still fodder for sci-fi TV specials.

“When I did my PhD in the 90s on back propagation neural networks, we were working with an input layer, an output layer, and two middle layers,” he recalled. 

“That’s not extremely complex.  But it ran for four days on an Apple Lisa before producing results.  I can do the same computation today in a picosecond on an iPhone.  That is an enormous, staggering leap in our capabilities.”

But in contrast to the heady potential of programmable pathologists and virtual caregivers that algorithmically anticipate a patient’s every need, many of the EHR vendors that anchored the HIMSS show floor are setting their sights much, much lower – and they’re doing it on purpose.

“There are a lot of vendors who are working to use AI for some very sexy problems in healthcare, like new precision medicine approaches for rare diseases,” said Girish Venkatachaliah, Vice President of athena.Intelligence at athenahealth.

“It’s great that someone is doing that, but we’re not looking at AI as a way to solve those moonshots.  We’re focused on the decidedly unsexy stuff: taking out the garbage and doing the grunt work of reducing frictions in the system that drive up administrative burdens.”

“We’ve referred to AI before as plumbing that can take the sewage out of our current workflows.  That might be a little gross, but that’s really what our needs are right now.”


NextGen is taking a similar approach, said the company’s President and CEO Rusty Frantz.  Technology like artificial intelligence should be used to enable better care, and sometimes that involves working from the bottom up.

“It’s all well and good to talk about a massive global vision, and I think there is some value and importance in that,” he said. 

“There are a lot of Swiss Army knife tools out there that try to give people data to solve every problem, but that doesn’t always work.” 

“Healthcare organizations can’t look through the haystacks to find all the needles.  They need something that just hands them the needles.  They don’t have time to waste on the search.  That’s where machine learning can come in.”

Girish Venkatachaliah, Vice President, athena.Intelligence

Source: Xtelligent Media

The conundrum of the ubiquitous, archaic fax machine is the perfect example of how AI can solve a problem that most providers have become resigned to accepting as part of their administrative lot in life, explained Venkatachaliah.

“Every doctor’s office uses a fax machine as one of their main ways to communicate, even in 2018,” he said. 

“We handle 110 million faxes per year for our clients – think about how much time that represents for their administrative staff.  Because these are public fax numbers, you can get a lot of junk communications mixed in with confidential lab reports and referrals.”

“Why not use machine learning to sort through them and differentiate the advertisements and restaurant menus from the important patient documents?”

“We can attach that patient’s faxed lab results to their electronic record, as well, and eliminate that task by automating it with machine learning.”
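The triage Venkatachaliah describes is, at its core, a text-classification problem.  As a rough illustration of the idea – not a depiction of athenahealth’s actual system – here is a from-scratch naive Bayes classifier; the labels and training snippets are invented purely for the sketch:

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

class NaiveBayesFaxTriage:
    """Tiny multinomial naive Bayes for sorting faxed documents.

    The labels and training snippets used below are invented for
    illustration; a production system would train on real OCR'd fax text.
    """

    def __init__(self):
        self.class_counts = Counter()                 # docs per label
        self.word_counts = defaultdict(Counter)       # label -> token counts
        self.vocab = set()

    def train(self, documents):
        for text, label in documents:
            self.class_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)

    def predict(self, text):
        total_docs = sum(self.class_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.class_counts:
            # log prior plus per-token log likelihoods with add-one smoothing
            score = math.log(self.class_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for tok in tokenize(text):
                count = self.word_counts[label][tok]
                score += math.log((count + 1) / (total_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

triage = NaiveBayesFaxTriage()
triage.train([
    ("lab results hemoglobin a1c patient chart attached", "patient_document"),
    ("referral for cardiology consult patient record", "patient_document"),
    ("lunch special pizza menu delivery coupon", "junk"),
    ("limited time offer discount toner supplies", "junk"),
])

print(triage.predict("attached lab results for patient"))  # patient_document
print(triage.predict("free coupon pizza menu"))            # junk
```

In practice the input would be OCR’d fax text, the training set would hold many thousands of labeled documents, and a “patient_document” prediction would be routed onward to the right chart – but the statistical core of separating menus from lab reports looks much like this.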

Even Epic Systems, which tends to think big and broad to meet the comprehensive needs of its clients, is making sure to keep its eye on the small stuff when it comes to leveraging AI. 

Improving the workflow and creating software that is “a joy to use” is admittedly a “high bar,” Epic founder and CEO Judy Faulkner said in late 2017 at her company’s annual User Group Meeting.  But machine learning can help the company get there.

Machine learning plays a major part in helping Epic customers streamline their operations and slash the time they spend on admin, said Seth Hain, R&D Division Manager for Analytics and Machine Learning.

“There’s a lot of data in the healthcare system today,” he said at HIMSS18. 

“We want to help tailor the system to pick out the most interesting information available, as well as the tasks they’re most likely to want to perform, and place them at the user’s fingertips.  That will allow the clinician to spend more time with the patient.”

Predicting clinical events in patients with hypertension, diabetes, asthma, and heart failure is a high-value target, but so are optimizing staffing levels by predicting patient flow, using natural language processing to summarize free-text documentation, and integrating Dr. Liz-like voice control features into the workflow.

“We have a long history of working with voice recognition systems like Nuance and M*Modal,” explained Sean Bina, Vice President of Access Applications at Epic. 

“There are probably tens of thousands of physicians who use that as part of their workflows for things like note-based documentation and radiology results.”

“But now we’re expanding into other areas, including virtual assistant workflows for nurses and doctors.  Instead of a nurse always having to have her hands on the workstation, she can just say, ‘blood pressure is 120/80’ out loud, and the system will use natural language processing to fill that into the EHR flow sheets.”
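The flowsheet example Bina gives – turning “blood pressure is 120/80” into structured fields – can be sketched with simple pattern extraction.  The field names and patterns below are hypothetical; a real deployment would sit behind a speech-recognition engine and the vendor’s own flowsheet interfaces rather than a handful of regular expressions:

```python
import re

# Hypothetical vitals patterns -- field names are invented for illustration.
VITALS_PATTERNS = {
    "blood_pressure": re.compile(
        r"blood pressure (?:is )?(?P<systolic>\d{2,3})\s*/\s*(?P<diastolic>\d{2,3})"),
    "heart_rate": re.compile(r"heart rate (?:is )?(?P<bpm>\d{2,3})"),
    "temperature": re.compile(r"temp(?:erature)? (?:is )?(?P<temp_f>\d{2,3}(?:\.\d)?)"),
}

def parse_vitals(utterance):
    """Return structured readings extracted from one dictated phrase."""
    utterance = utterance.lower()
    readings = {}
    for vital, pattern in VITALS_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            # Convert each named capture group to a numeric value
            readings[vital] = {k: float(v) for k, v in match.groupdict().items()}
    return readings

print(parse_vitals("Blood pressure is 120/80, heart rate is 72"))
```

A production NLP pipeline would handle far messier phrasing, unit variation, and ambiguity than fixed patterns can, but the end product is the same: a hands-free utterance mapped into discrete flowsheet fields.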

Is consumerization the “killer app” that brings AI into the EHR?

In his keynote address, Schmidt lamented the lack of a “killer app” in healthcare that would force a major change in the way data is shared and utilized.

Much as email and the internet revolutionized the exchange of information across entities that never saw a need to communicate in a standardized manner before, healthcare needs a catalyst – one stronger than value-based care, apparently – to supercharge the adoption of better data-driven decisions.

Instead of a specific technology, however, healthcare’s killer app might be something a little more nebulous: the idea of consumerism.

Consumerism isn’t just overhauling the way patients demand to be treated and cared for.  EHR users are consumers, too, when they’re not in the office. 

They use smartphone apps to order a ride to work and their morning coffee to go; they click on recommended news stories in their feeds; they expect that their web search results give them exactly what they’re looking for every single time.

When they sign into their shopping or social media accounts, they are guaranteed to see their personalized settings saved, their preferences acknowledged, and their suggestions for new products or services to be more or less on point – or at least strongly related to data they know they have generated about themselves.

One of the reasons Amazon has quickly become such a threat to the traditional healthcare environment is its ability to use machine learning to personalize almost anything that crosses its path. 

Consumer targeting is its secret weapon, and paired with the sheer scale on which it operates, it is a good reason why other would-be competitors are gobbling up all the data assets they can get.

But EHR users have been largely left out of this push to create personalized experiences based on algorithms that learn an individual’s preferences and habits.  And there doesn’t seem to be a clear reason why.

“Personalization is so important in every aspect of how we use technology these days, but it’s completely absent in the EHR,” said Venkatachaliah. 

“Netflix can deliver a dozen movies that are to my taste in an instant, but when a clinician asks for a better way to do his job, we can only say, ‘if you don’t like it, too bad.’”

“That’s not going to cut it anymore.  Physicians have to see the same number of patients as they did five years ago, or maybe more – plus they have to spend 50 percent of their time looking at a laptop. Why can’t we at least build a workflow based on exactly what they want to look at?”


Users should be able to make simple changes to the way their interfaces appear to make interacting with the system more comfortable, agrees Epic founder Judy Faulkner. 

“If I want to see this column first but you want to see that one first, there’s no reason we can’t make that happen for both of us,” she said. 

“Personalization doesn’t mean a lack of standardization.  It just means creating an experience that helps you process the information you’re getting in a way that makes sense to you.”

EHR vendors already have access to the tools to learn what individuals want out of their specific workflows, pointed out Venkatachaliah.  And if they don’t have them now, there’s no reason why they can’t borrow techniques from their peers in other consumer-driven industries.


“If I’m a physician and I tend to look at a patient’s X-ray first and then their MRI, why can’t my EHR recognize that and present the X-ray first next time I see a patient with a similar cluster of symptoms or a similar diagnosis?” he asked.  “We can use machine learning to drive that.”

“A clinician shouldn’t have to click through fifteen screens and zigzag through the application to pick the workflow that suits them.  That type of personalization might only eliminate 30 or 40 seconds per encounter.  But multiply that by the fifteen or twenty patients they see each day, and you are saving some significant time in the long run.” 
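Venkatachaliah’s X-ray-before-MRI example boils down to learning a clinician’s habitual first action in a given context.  Below is a minimal sketch of that idea using a simple frequency count, with invented identifiers, followed by the back-of-envelope arithmetic behind his 30-to-40-seconds claim:

```python
from collections import Counter, defaultdict

class WorkflowPersonalizer:
    """Hypothetical sketch: count which study type each clinician opens
    first for a given diagnosis, then surface the most frequent choice
    next time.  A production system would use a richer model, but even
    a frequency count captures stable habits."""

    def __init__(self):
        # (clinician, diagnosis) -> Counter of first-opened study types
        self.first_opened = defaultdict(Counter)

    def record_first_open(self, clinician, diagnosis, study_type):
        self.first_opened[(clinician, diagnosis)][study_type] += 1

    def suggest_first(self, clinician, diagnosis, default="chart_review"):
        counts = self.first_opened[(clinician, diagnosis)]
        return counts.most_common(1)[0][0] if counts else default

p = WorkflowPersonalizer()
for _ in range(5):
    p.record_first_open("dr_a", "knee_injury", "x_ray")
p.record_first_open("dr_a", "knee_injury", "mri")

print(p.suggest_first("dr_a", "knee_injury"))   # learned habit: x_ray

# Back-of-envelope version of the time savings quoted above:
# roughly 35 seconds saved across roughly 17 visits per day.
seconds_saved_per_day = 35 * 17
print(f"~{seconds_saved_per_day / 60:.0f} minutes per clinician per day")
```

Even at the midpoint of those estimates, the savings come to roughly ten minutes per clinician per day – modest per encounter, meaningful across a practice.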

Saving time (and lives) by using AI to democratize clinical experience

Patients are in need of personalized healthcare experiences, too, but sometimes too much individualization can put them in jeopardy.

Standards of care are important, and AI can help providers walk the line between personalized medicine and validated clinical best practices.

“It tends to take years for a new clinical idea to become the standard of care,” said PwC’s James Golden.  “When you think about something like breast cancer, the best practice used to be ‘cut, poison, burn.’”

James Golden, PhD, Senior Managing Director at PwC Healthcare Advisory Group

Source: Xtelligent Media

“Then someone said, ‘You know, if we poison before we cut, then we can cut better.’  And that proved out in practice, so now most oncologists do that.  But not everywhere.” 

“There are still places where the physician will say, ‘You know, I was trained a certain way when I did my MD in 1978, and I’m still going to do it this way because that’s what I was taught.’”

There is an important opportunity for artificial intelligence to reduce those questionable variations in practice and speed up the spread of new ideas through a very fragmented health system, he asserted.

Whether due to costs or proximity, not every patient can access a top-ranked health system with an entire department dedicated to cutting-edge research in their particular area of need.

That can leave some patients feeling as if they are not getting the best possible care – and it can leave some perfectly high-quality community health systems with an undeserved image problem.

“If you have a rare cancer, you want someone with experience in exactly that condition,” he said.  “You want the person who sees ten of them a day, not ten of them a year.  A lot of patients assume they can only get that at a massive, nationally-recognized system.”

“But when you train an AI algorithm on hundreds of thousands or millions of records, it’s like having a clinician with all of that experience at the fingertips of those who don’t have it firsthand.”

“It makes it easier for them to access those best practices, see the evidence that backs them up, and decide if they want to change the choices they were going to make.”

Democratizing insights through clinical decision support (CDS) driven by artificial intelligence may be able to improve outcomes and lower the costs of unnecessary tests, surgeries, or other treatments. 


It may also improve provider satisfaction if they feel confident that their new AI tool sets can help them make decisions that are better for their patients without adding more admin to their plates.

Unfortunately, the current CDS technology landscape offers AI developers a how-to guide on how not to integrate decision-making tools into the EHR workflow.  The overwhelming number of alerts, hard stops, pop-ups, and alarms is part of the reason why providers are so frustrated with their health IT experiences.

The next generation of CDS tools will have to be much smarter, more tailored, and less intrusive than what passes for clinical decision help today. 

Artificial intelligence can support a better way forward – if developers play their cards right as they work through the problems that plague clinicians now.

“The provider workflow is the last mile of the health IT journey, and we have not completed that journey just yet,” said NextGen’s Dr. Rabinowitz. 

“As an industry, we are just at the beginning of this very heavy lift of bringing together all the data we need with the governance, the processes, and the fine-tuning that will create the experiences we all want clinicians to have.”

For Schmidt, who has seen similar adoption dramas play out across multiple industries during his forty-year career, the future for AI in healthcare looks bright.

“We have so many of the tools already to create something like Dr. Liz and do so much more,” said Schmidt.  “All it takes is for every one of us in this room to figure out how to build it.” 

“This is really hard.  It’s really humbling, and it’s really complicated.  But if we all work together, we can save lives at a scale that is unimaginable because of the impact of these technologies.”

“And in ten years, if you ask me back again, I will bring the equivalent of Dr. Liz to join me on stage.”

This article was originally published on March 21, 2018.

