- Clinical decision support tools may not be as useful or reliable as clinicians would like, states a new study published in the Journal of the American Medical Informatics Association (JAMIA), and may be prone to significant, widespread, and long-term errors.
Researchers who reviewed clinical decision support (CDS) malfunctions at Brigham and Women’s Hospital found that edits to the electronic health record, mistakes in underlying databases, and EHR software upgrades all prevented the CDS system from working appropriately.
“Finding ways to improve the safety of these systems is still a work in progress,” wrote a team of authors from Partners Healthcare and other Boston-area institutions. “For example, paper-based systems cannot detect and alert clinicians of drug-drug interactions, whereas electronic clinical decision support systems can. However, implementation of health IT products does not automatically improve patient safety.”
Usability concerns, software glitches, and unauthorized workarounds can all reduce the value and effectiveness of health IT systems such as clinical decision support tools.
“Examples include when alerts stop firing due to a system upgrade, when alerts fire for the wrong patients after a drug dictionary change, and when alerts are inadvertently disabled,” the authors said.
In order to better understand the value and limitations of CDS systems, the researchers focused on four alerts that exhibited “anomalies” after a comprehensive review. At the time of the study, Brigham and Women’s Hospital, part of the Partners Healthcare network, used a proprietary longitudinal medical record system developed locally but certified for use by the Office of the National Coordinator.
The malfunctioning alerts included reminders for routine screenings and long-term monitoring of patients on certain medications.
Suggestions for monitoring the thyroid function of patients taking amiodarone, an antiarrhythmic drug with known thyroid-related side effects, failed to fire due to a rule logic change that was not implemented appropriately. The change was made in November of 2009, but the issue was not discovered and rectified until February of 2013.
Investigation into irregular patterns in lead screening alerts for two-year-olds between 2009 and 2011 revealed that software engineers had failed to document changes to the rule logic since 2006.
An automated audit record suggested that several changes to the clinical alert functionality had been made during this time, but “because of a software issue in the audit logging routine, it was not possible to reconstruct the sequence of rule changes or the specific dates when individual changes occurred,” the article explains.
An upgrade to the electronic health record unintentionally caused more than 3000 chlamydia screening alerts to fire in just one day, and produced a large number of other inappropriate and inaccurate alerts.
The record for one healthy two-month-old boy contained “numerous duplicate reminders, including suggestions that the physician order mammograms, Pap smears, pneumococcal vaccination, and cholesterol screening, and suggestions that the patient be started on several medications, all of which should not apply to this young, healthy, male patient,” said the authors.
Lastly, the investigators looked at an incident in 2012, when a rule change caused the CDS system to suggest antiplatelet therapy for all patients with coronary artery disease, whether or not the patient was already taking aspirin or other antiplatelet medications.
“This service malfunctioned due to a database issue, which occurred after the server hosting the service was rebooted, that caused it to begin reporting that no drugs were in the antiplatelet classes,” the team explains. It took several weeks for engineers to pinpoint and solve the problem.
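The failure mode the team describes is easy to illustrate: if the service that maps drugs to classes silently returns an empty set, a "patient is not already on an antiplatelet" check passes for every CAD patient. The sketch below is a hypothetical toy, not Brigham and Women's actual rule logic or drug dictionary.

```python
# Toy illustration of the reported failure mode (all names hypothetical):
# a drug-class lookup that returns an empty set makes the membership
# check fail for every patient, so the reminder fires for everyone.

ANTIPLATELET_CLASS = {"aspirin", "clopidogrel", "prasugrel"}

def antiplatelet_class_members(service_healthy):
    """Stand-in for the drug-classification service; after the
    reboot-related database issue it reported no drugs in the class."""
    return ANTIPLATELET_CLASS if service_healthy else set()

def should_suggest_antiplatelet(has_cad, active_meds, class_members):
    # Fire the reminder when a CAD patient takes no antiplatelet drug.
    return has_cad and not (set(active_meds) & class_members)

meds = ["aspirin"]
# Working service: no alert, the patient already takes aspirin.
print(should_suggest_antiplatelet(True, meds, antiplatelet_class_members(True)))   # False
# Broken service: the empty drug class makes the alert fire anyway.
print(should_suggest_antiplatelet(True, meds, antiplatelet_class_members(False)))  # True
```

Because the rule itself was unchanged, the error surfaced only through its inputs, which helps explain why engineers needed several weeks to trace it.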
The results of these case studies led the authors to theorize that similar clinical decision support malfunctions are likely to exist in other hospitals and health systems. To test their hypothesis, they asked twenty-nine chief medical information officers (CMIOs) from different organizations about their CDS systems.
The majority of respondents used drug-drug interaction alerts, allergy alerts, and screening reminders. Less common functionalities included alerts about abnormal test results, drug-pregnancy alerts, and reminders geared directly towards patients.
When asked how often their facility had experienced a clinical decision support system malfunction, just seven percent of CMIOs said their record was completely clean. Thirty-eight percent said an identifiable error occurred four or more times per year, while 28 percent said they experienced an issue less than once a year.
Upgrades to EHR software and changes to underlying codes or data fields were the most common reasons why a CDS functionality failed to work appropriately. Forty-one percent said an inadvertent change to a rule caused a problem, while 24 percent said that database corruption or another system malfunction caused the CDS system to fail.
User reports were most helpful in uncovering errors for 83 percent of CMIOs. Forty-eight percent said they noticed a problem during their own use of the system.
While identifying issues may be common, CMIOs expressed little confidence that they can fix what’s wrong. None of the participants in the survey said that they were “totally confident” that their current processes were good enough to detect CDS malfunctions before they reach the end-user. Forty-one percent said they were “not very confident” that they could manage their CDS tools, while 21 percent said they had no faith in their established procedures.
“During our investigation process, two things stood out as surprising,” the team said. “First, although end users often knew about the CDSS malfunctions, patient safety and quality leaders and software developers were mostly unaware of the issues and were surprised by the frequency with which they occurred.”
“Second, it was quite difficult, at least in the [Brigham and Women’s longitudinal medical record], to piece together the history of changes to CDS rule logic. Change logs were maintained manually outside of the EHR systems, but they were often incomplete and did not always match the logic of actual rules running in the EHR.”
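The gap the authors describe, manual change logs kept outside the EHR that drift out of sync with the rules actually running, points toward recording every rule edit automatically in an append-only log. A minimal illustrative sketch (all names and fields are hypothetical, not the study's recommendation):

```python
# Minimal sketch of automatic rule-change logging: every edit appends
# a timestamped entry capturing the before/after logic and the author,
# so the history can be reconstructed later.
from datetime import datetime, timezone

def log_rule_change(change_log, rule_id, old_logic, new_logic, author):
    """Append one immutable entry to the change log and return it."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "rule_id": rule_id,
        "old_logic": old_logic,
        "new_logic": new_logic,
        "author": author,
    }
    change_log.append(entry)  # append-only: entries are never rewritten
    return entry

log = []
log_rule_change(log, "chlamydia_screen", "age < 25", "age <= 25", "dev_a")
log_rule_change(log, "chlamydia_screen", "age <= 25", "age < 26", "dev_b")
```

In practice each entry would be persisted (for example, as a line of JSON) rather than held in memory; the point is that a chain of entries whose `old_logic` matches the previous `new_logic` lets investigators replay exactly when and how a rule changed.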
Poor documentation habits, along with the inherent difficulty of understanding when an alert should fire but doesn’t, can contribute to significant errors that go undetected for long periods of time.
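One way to catch an alert that should fire but doesn't is to watch its firing counts over time: a rule that suddenly goes quiet, or suddenly floods, stands out against its own history. The sketch below is a simple rolling z-score monitor, offered as an illustration rather than the method the JAMIA authors used.

```python
# Illustrative sketch (not the study's method): flag days when an
# alert's firing count drifts far from its recent historical mean.
# This surfaces both silent failures (counts collapse toward zero)
# and alert storms (counts spike), such as those described above.
from statistics import mean, stdev

def flag_anomalies(daily_counts, window=30, threshold=3.0):
    """Return indices of days whose count sits more than `threshold`
    standard deviations from the mean of the preceding `window` days."""
    flagged = []
    for i in range(window, len(daily_counts)):
        history = daily_counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            sigma = 1.0  # flat history: avoid division by zero
        if abs(daily_counts[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A rule that fires ~50 times a day, then silently stops on day 40.
counts = [50] * 40 + [0] * 5
print(flag_anomalies(counts))  # → [40, 41, 42]
```

Note that once the zero days dominate the trailing window, the anomaly is absorbed into the baseline, which is one reason long-running silent failures like the amiodarone alert are so hard to spot after the fact.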
As the healthcare system comes to rely more heavily on clinical decision support tools as a way to increase patient safety and engage in proactive population health management, engineers and executives will need to keep a close eye on the functionality of CDS tools and the background work required to update and maintain these complex systems.