Leveraging Artificial Intelligence to Reduce Clinician Burnout

Providence health system is using an artificial intelligence solution to alleviate clinician burnout and improve the care experience.

By Jessica Kent

While the dawn of the EHR promised streamlined, accelerated healthcare delivery, the technology can also include burdensome alerts and documentation requirements that lead to clinician burnout.

Providers often spend more time documenting than they do seeing their patients, resulting in poor care experiences and stunted patient-provider relationships.

At Providence, one of the largest health systems in the country, leaders were searching for a solution to problems stemming from EHR documentation.

“At our organization, clinician burnout and productivity were an issue. The amount of time that clinicians spend on documentation is probably the single biggest issue for our caregivers,” said B.J. Moore, executive vice president and CIO at Providence.

“We saw an opportunity to make the documentation process more seamless for our caregivers, to take that burden off their hands and allow them to focus on care delivery instead of hands-on keyboard and documentation.”

To improve both the patient and provider experience, Providence recently partnered with Nuance Communications to leverage an artificial intelligence platform that uses ambient sensing technology to securely and privately listen to clinician-patient conversations. The tool can also offer workflow and knowledge automation to complement the EHR.

“We’ve got a device in the clinical setting that is listening to the conversation. That device knows who the caregiver is, who the patient is, and who the family members are in the room,” said Moore.

“And this ambient device is able to tell whether people are just chit-chatting about the weather, friends, family, the kids, whatever. And it's intelligent enough to know to ignore it, because the conversation isn’t clinical in nature. As that clinical discussion shifts, it recognizes the doctor's voice and will start adding the key points to the health record.”

The AI platform not only listens for pertinent information from patients, but also transcribes it into the appropriate medical language – medications, dosages, symptom descriptions – and adds that data to the EHR.
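
Nuance has not published the internals of the platform, but the workflow Moore describes – recognize who is speaking, ignore small talk, and lift medication, dosage, and symptom details into the record – can be sketched in a few lines. The keyword rules, field names, and sample transcript below are illustrative assumptions for a conceptual sketch, not the vendor's implementation.

    # Hypothetical sketch of the described pipeline: filter out small talk,
    # pull clinical details from what remains, and stage them for the EHR.
    # Real ambient systems use trained speech and language models; the
    # keyword rules and field names here are illustrative assumptions only.
    import re

    CLINICAL_CUES = {"mg", "dose", "pain", "prescribe", "symptom", "blood pressure"}

    def is_clinical(utterance: str) -> bool:
        """Crude stand-in for the model that separates chit-chat from clinical talk."""
        text = utterance.lower()
        return any(cue in text for cue in CLINICAL_CUES)

    def extract_note_fields(utterance: str) -> dict:
        """Pull dosage mentions into a structured field alongside the source text."""
        dosage = re.search(r"(\d+\s?mg)", utterance, re.IGNORECASE)
        return {
            "source_text": utterance,
            "dosage": dosage.group(1) if dosage else None,
        }

    def build_draft_note(transcript: list[dict]) -> list[dict]:
        """Keep only clinician utterances that are clinical, then structure them."""
        return [
            extract_note_fields(turn["text"])
            for turn in transcript
            if turn["speaker"] == "clinician" and is_clinical(turn["text"])
        ]

    transcript = [
        {"speaker": "patient", "text": "The kids start school next week."},
        {"speaker": "clinician", "text": "Let's increase the lisinopril to 20 mg daily for the blood pressure."},
    ]
    print(build_draft_note(transcript))  # only the clinical line reaches the draft note
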

“The tool allows for a natural conversation between a patient and a caregiver. Today, care experiences often involve a clinician in front of a monitor and keyboard, trying to multi-task and deliver patient care while entering information into the EHR,” said Moore.

“The goal is having this ambient, intelligent device that can listen to the conversation and put the information where it needs to be. That's where we see a big breakthrough from a caregiver perspective, as well as a patient perspective. Because as a patient, I value having a one-on-one conversation with my doctor versus waiting for them to type something in the background.”

To ensure the success of the AI tool, Moore and his team are evaluating the system in different parts of care delivery. Providence clinicians will also be able to offer feedback on the solution, Moore noted.

“We're piloting the platform in various areas. We're partnering with Microsoft and Nuance to work through some of the complications, like the quality of transcription and its ability to understand more intricate parts of the patient-caregiver experience,” he said.

“It is cutting-edge technology, so it's not a matter of installing it and having it work. It's finding the environments in which it can be most effective, and understanding how to phase it in. How do we work with our partners to make incremental improvements? We're on the ground floor of this platform versus a mature product that's in an implementation phase.”

With AI, organizations can significantly enhance transcription capabilities and highlight the most important aspects of patient-provider conversations.

“Transcription exists today, and the technology actually works very well. The AI comes in and it can intelligently determine what is filler or what may be a personal conversation, and determines what is the pertinent medical information. The intelligence piece differentiates it from just transcription,” said Moore.

“A 30-minute conversation could be pages and pages of information, whereas the AI intelligently winnows that down to the vital information while correcting for grammar, spelling, and medical context. It's like having a human being in the room intelligently transcribing it versus just standard transcription.”
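
The contrast Moore draws between verbatim transcription and intelligent winnowing can be illustrated with a toy example. The clinical-density heuristic below stands in for the trained models that decide what is "vital"; it is a sketch under that assumption, not a description of how the Nuance platform works.

    # Illustrative contrast: plain transcription keeps everything, while the
    # "winnowing" Moore describes keeps only the clinically dense sentences.
    # The scoring heuristic is an assumption; production systems rely on
    # trained summarization models rather than keyword counts.
    CLINICAL_TERMS = {"mg", "daily", "pain", "nausea", "refill", "follow-up", "dose"}

    def clinical_density(sentence: str) -> float:
        """Fraction of words in the sentence that are clinical terms."""
        words = sentence.lower().split()
        if not words:
            return 0.0
        return sum(w.strip(".,") in CLINICAL_TERMS for w in words) / len(words)

    def winnow(transcript: list[str], keep: int = 2) -> list[str]:
        """Return the few sentences most likely to matter for the note."""
        ranked = sorted(transcript, key=clinical_density, reverse=True)
        return ranked[:keep]

    transcript = [
        "How was the trip to see your daughter?",
        "The nausea comes back about an hour after each dose.",
        "Let's drop the dose to 5 mg daily and schedule a follow-up in two weeks.",
        "Traffic was terrible on the way in today.",
    ]
    print(winnow(transcript))  # the two clinically dense lines survive; the small talk does not
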

For organizations seeking to implement similar tools, Moore stressed how critical pilot phases are for ensuring the effectiveness of ambient AI.

“Because it is such a nascent technology, it's not ready to just deploy across whole health systems. You would have to determine intelligent scenarios, environments, and clinical settings in which to pilot it. It’s also important to have doctors that are patient and know that they're working with early technology,” he said.

“If you get somebody that thinks it's a baked solution and you give it to them, they're going to get frustrated, not use the products, and not give you valuable feedback. If you pick a doctor or a caregiver that recognizes that it’s an early technology, in return you're going to get effective feedback.”

While the potential for AI has been well-documented in healthcare, it’s critical to remember how young the technology really is.

“This is still a work in progress,” Moore concluded.

“Artificial intelligence is still in its early, early days. So, you have to experiment with it, you have to learn with it, you have to mature with it. I remind our clinical workers that these aren't turnkey solutions. We're on a hundred-year journey and this is year three. We're still in the early days.”