- During many healthcare encounters, especially in the hospital setting, speed can be the difference between life and death.
Patients with serious conditions often require immediate intervention when vital signs drop or complications occur, but many organizations are still struggling to develop the big data analytics skills that allow a care team to spring into action with all the information it needs to make a difference.
Healthcare’s growing familiarity with big data is making this process a little easier for some forward-thinking providers.
But real-time alerts, predictive analytics at the point of care, and instantaneous clinical decision support tools are still lingering on the wish-list for a large proportion of the provider community.
Instead of diving straight into the difficult world of immediate bedside insights, many providers are starting at the other end of the speed spectrum: broader population health management analytics that can highlight opportunities for systemic change and help develop best practices. Those best practices can, in turn, inform initiatives that bring faster workflow changes.
Both types of tasks are critical for clinicians who want to make the best possible decisions for their patients, explains Dr. Danyal Ibrahim, Chief Data and Analytics Officer at Saint Francis Care in Connecticut, and insights of all different velocities can play important roles in improving the quality, safety, and delivery of patient care.
“Understanding the dimensions of time is a very significant challenge for analytics, and the issues vary with the timescale that you’re looking for,” said Ibrahim, who will be sharing more of his big data analytics expertise at the Value-Based Care Summit on November 15, 2016 in Boston.
“You may have a patient surveillance system that requires really large datasets to monitor their vitals and maybe do some predictive analytics about how they’re trending, and that requires immediate, truly real-time insights for the patient while they are still in your care setting.”
But population health management prioritizes different metrics and aims to achieve somewhat different results, he added. “Population health is about identifying groups of patients and figuring out a commonality around their needs. After you identify a common need, you redesign care around delivering that service or improving that outcome.”
Population health programs also tend to generate large volumes of data, which are often used to track improvements over time. “But it’s not high velocity data, because the outcomes are rarely immediate,” Ibrahim points out.
“If I’m a care quality program leader, I want to see a whole data set about the entire population,” he continued. “I want to identify problems and figure out where that low-hanging fruit is so I can make large-scale changes over time. I might be looking at diabetic weight loss over months or even years, or watching ED usage rates over several performance quarters before I can see any meaningful trends. It’s a different approach to analytics.”
However, long-term population health management and real-time patient surveillance are not entirely separate animals. Both methodologies feed off of each other and inform each other – and both require the ability to integrate point-of-care insights with broader data that establishes patterns and helps to predict future events.
“You will likely use similar data for each problem,” Ibrahim explained. “For example, I might want to design a surveillance program around a patient based on certain criteria, such as kidney function. The goal is to generate a signal or an alert when the patient exceeds a dangerous parameter.”
“So if the patient needs a CAT scan study with contrast, but they are experiencing reduced kidney function and they are also taking a medication that increases their risk of kidney injury, the clinician needs to receive an alert that something could go wrong if they order that specific test.”
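The rule Ibrahim describes combines three checks: the order involves contrast, kidney function is reduced, and the patient is on a medication that raises injury risk. A minimal sketch of that logic might look like the following; the eGFR threshold, the medication list, and all names here are illustrative assumptions, not Saint Francis Care's actual clinical criteria.

```python
# Hypothetical contrast-study alert rule; thresholds and the medication
# list are assumptions for illustration, not real clinical guidance.
EGFR_WARNING = 45  # mL/min/1.73m^2 -- assumed warning threshold
NEPHROTOXIC_MEDS = {"ibuprofen", "gentamicin", "lisinopril"}  # illustrative

def contrast_study_alert(order_has_contrast, egfr, active_meds):
    """Return True when a contrast order should trigger a kidney-injury alert."""
    if not order_has_contrast:
        return False
    reduced_function = egfr < EGFR_WARNING
    risky_med = any(m.lower() in NEPHROTOXIC_MEDS for m in active_meds)
    # Alert only when both risk factors are present alongside the order
    return reduced_function and risky_med
```

In a real system this check would run inside the EHR's order-entry workflow, so the clinician sees the warning before the study is scheduled.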
This type of alert can prevent a serious patient safety event that could end in disaster for the individual, but whether or not an alarm goes off for one patient does not necessarily help the organization as a whole improve its diagnostic and safety skills.
In order to understand how to prevent patients from reaching the danger zone in the first place, the organization will need to take a broader view and try to glean insights from a larger number of events by gathering data on the circumstances involved, the actions of the care team, and patient outcomes.
“The data elements will overlap in those projects – I still need to know about CAT scans, medications, and kidney function – but the velocity is not the same at all,” said Ibrahim.
“I will design the algorithm differently if I want to create an analytics initiative that tracks how many patients are receiving contrast studies in the hospital, how many of them are sustaining kidney injuries, and how that is related to the medications they are taking. I don’t need that information at the point of care, but I do need it over a longer period of time to understand how my organization is performing and if we are improving or not.”
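The retrospective version of the same problem is an aggregation over many encounters rather than a point-of-care check. A rough sketch of that rollup, under the assumption that each encounter record carries flags for contrast use, kidney injury, and risky-medication exposure (the field names are invented for illustration):

```python
# Illustrative retrospective rollup over encounter records; the record
# schema is a hypothetical simplification, not an actual EHR extract.
def contrast_injury_summary(encounters):
    """encounters: iterable of dicts with boolean keys
    'contrast_study', 'kidney_injury', and 'nephrotoxic_med'."""
    contrast = [e for e in encounters if e["contrast_study"]]
    injured = [e for e in contrast if e["kidney_injury"]]
    on_med = sum(1 for e in injured if e["nephrotoxic_med"])
    return {
        "contrast_studies": len(contrast),
        "kidney_injuries": len(injured),
        "injuries_on_risky_med": on_med,
    }
```

Run monthly or quarterly, counts like these are what let a quality program see whether the organization is improving, without any need for real-time delivery.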
The health IT vendor market is often segmented along velocity lines, he said, with some developers targeting patient management and population health capabilities and others working to bring instantaneous results to clinicians as soon as a patient’s vital signs change.
Healthcare organizations looking for an off-the-shelf product may need to investigate multiple offerings tailored to their desired insights.
But regardless of what tool they choose, they should be careful to avoid creating data silos that will prevent them from accessing the full continuum of information required to generate actionable results, Ibrahim advises. They should also pay close attention to ensuring high levels of completeness, accuracy, and data integrity.
“No matter how you are going to use the information, data integrity is very important,” he stressed. “If you can create a solid foundation of accuracy within the data for whatever application it ends up supporting, you are going to be starting from a much better place.”
Saint Francis Care is taking the entire velocity continuum into account when it comes to one of the biggest threats to patient safety and outcomes: sepsis.
“A lot of organizations are really challenged by sepsis. It’s a very deadly disease,” Ibrahim said. Mortality rates from the insidious condition can reach up to 30 percent, according to the CDC, and septicemia may be responsible for up to 5 percent of all spending in the hospital setting.
Ibrahim and his team have designed several initiatives to reduce the impact of sepsis and brainstorm quality improvement strategies that make it easier for caregivers to detect and treat patients on the verge of a downturn.
“We have a high velocity alert program that fits within the workflow and the electronic health record,” said Ibrahim. “If the patient meets certain criteria – they may have a fever and have vital signs or lab results within the warning range – the care team will receive an alert that this patient may be septic.”
“The team can immediately reorient their approach to the patient and take action to give him antibiotics or fluids or conduct other tasks based on best practices to mitigate the problem.”
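Ibrahim describes the alert firing when a patient meets a combination of warning-range criteria. A minimal screening sketch, loosely modeled on SIRS-style vital-sign criteria, could look like the code below; the thresholds and the two-of-four rule are common textbook values used here as assumptions, not the actual criteria in the Saint Francis Care program.

```python
# Hypothetical sepsis screen; thresholds are illustrative SIRS-style
# values, not the actual Saint Francis Care alert criteria.
def sepsis_screen(temp_c, heart_rate, resp_rate, wbc_k):
    """Return True when at least two warning-range criteria are met."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,  # fever or hypothermia
        heart_rate > 90,                 # tachycardia
        resp_rate > 20,                  # tachypnea
        wbc_k > 12.0 or wbc_k < 4.0,     # abnormal white count (x10^9/L)
    ]
    return sum(criteria) >= 2
```

The point of embedding a rule like this in the EHR workflow, as Ibrahim notes, is that the care team sees the flag while there is still time to act.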
A sepsis improvement coordinator, on the other hand, will have a different priority, he added. “For quality improvement, we will need to look at reaction times and outcomes across the whole hospital. How many patients did we manage correctly? How many cases could we have addressed more quickly? What teams performed better than others, or what areas in the hospital did better, and what are our opportunities to improve?”
Quality improvement officers may examine those questions on a monthly or quarterly basis, but Saint Francis care teams can also benefit from a middle-ground approach, Ibrahim said.
“Within 24 hours of a sepsis diagnosis, we send a report card to the care team,” he explained. “We identify all the staff members involved in that patient’s care and break down all the things that have to be done when a patient is identified as septic. We color-code all those processes, so we can see that these three items are green, but this one is red. They can see that they were half an hour off on giving antibiotics, or that next time they need to be more organized when doing some other task.”
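The report card Ibrahim describes amounts to comparing each required step against a target completion time and coloring the result. A simple sketch of that scoring, with hypothetical step names and targets (the real sepsis bundle steps and time windows at Saint Francis Care are not specified in this article):

```python
# Sketch of the color-coded report card; step names and target times
# are hypothetical, not the hospital's actual sepsis bundle.
TARGETS_MIN = {"blood cultures": 60, "antibiotics": 60, "fluid bolus": 180}

def report_card(actual_minutes):
    """Map each bundle step to 'green' if done within target, else 'red'."""
    return {
        step: "green" if actual_minutes.get(step, float("inf")) <= target
        else "red"
        for step, target in TARGETS_MIN.items()
    }
```

A team that gave antibiotics 90 minutes after diagnosis against a 60-minute target would see that single item flagged red, which matches the "half an hour off on giving antibiotics" example in the quote.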
“It’s so important to provide timely feedback to the care team, because if we give them that report a week later or a month later, it’s not going to be as relevant and it’s not going to be fresh in their memory,” he continued. “So it’s not the same analytics as what we would deliver at the bedside, and it’s not what a process improvement coordinator might need, but it hits that spot right in the middle.”
The three-tiered approach allows Saint Francis Care to cover all the bases in relation to patient safety, and it encourages care teams to learn from their experiences while continuing to improve. Leveraging similar pieces of big data at different speeds lets the organization address a variety of tasks that have a direct impact on care quality and outcomes.
“All of these pieces of insight are really relevant and work together to address the broadest possible array of issues that might improve patient care,” Ibrahim said. “It’s very important to understand the velocity spectrum so you know what data you need to capture and which tools will help you utilize them for quality improvements and the insights you really need.”
To learn more about how healthcare providers can leverage big data analytics strategies for a successful transition to the value-based care environment, sign up for a seat at the Value-Based Care Summit on November 15, 2016.
Visit ValueBasedCareSummit.com today to register.