The healthcare industry has always had a bit of a rocky relationship with information technology, and the Internet of Things is shaping up to be no exception.
As devices get smaller and data gets bigger, providers are still struggling with meaningful ways to collect, analyze, visualize, and utilize data that can make a measurable difference in patient care.
While most of the big data debate has centered on the perceived shortcomings of electronic health records and the significant challenges of intuitive, streamlined clinical decision support, the Internet of Things is adding yet another layer of difficulty to the proposition of harnessing a growing mountain of data for improving clinical care.
Patient-generated health data (PGHD), created by IoT devices like wearables, blood glucose monitors, home scales, telehealth tools, mHealth apps, and other sensors, is a particular thorn in the side of health information management, data integrity experts, and EHR workflow designers.
This messy, unstandardized, poorly defined mass of data defies traditional information governance techniques, and suffers from a negative perception among overwhelmed, uninterested clinicians.
Moreover, the sheer volume of PGHD generated every day poses major problems for overloaded analytics infrastructures that simply do not have the bandwidth to handle the torrent of big data headed their way.
Reports by Cisco and Business Insider predict that the Internet of Things – or the “Internet of Everything,” as some call it – may include 50 billion individual devices that will produce 507.5 zettabytes of data by the end of the decade. A zettabyte is one trillion gigabytes.
That mind-boggling figure, of course, encompasses the global landscape across all industries, some of which have embraced the IoT much more quickly and completely than healthcare. But if even a fraction of that data flows through the US healthcare system, the impact will be staggering.
Source: Business Insider
Not only must EHR developers and analytics experts find a way to display that data; they must also do so quickly. At the moment, big data can take days, weeks, or even months to make the journey from its source to its end user.
That may be acceptable for operational or revenue cycle analytics that can schedule decision making on a quarterly basis, but the patient management potential of the healthcare IoT demands a quicker approach.
A patient in the ICU may have only minutes before a dip in vital signs turns into a catastrophic crash; if readings from his wearable sensors don’t reach the nursing staff within that narrow window, his life could easily be lost. Likewise, providers need to be able to immediately contact a homebound patient with heart failure who shows a drastic drop in average step count, in case her condition has taken a turn for the worse.
To enable what is now known as “real-time analytics,” healthcare organizations often use cloud solutions as the storage layer between device and insight. Data is uploaded wholesale to the cloud, relevant elements are identified and pulled back down into an analytics engine, and the results are then shuttled to a visualization in the user interface. Even in the best case, that round trip takes minutes, and some patient safety and quality care decisions simply cannot wait that long.
The answer may lie not in the cloud, but in the fog.
Fog computing, also known as edge computing, allows devices to conduct critical analytics on their own, without the need for the cumbersome cloud storage process. Speed is the name of the game for this type of processing, and it could be the key to making the healthcare Internet of Things truly useful for inpatient care, patient engagement, population health management, and remote monitoring.
What is fog computing?
Fog computing adds an additional layer of computing power between the device and the cloud, keeping critical analytics closer to the device, and therefore reducing the time it takes from request to answer. Individual devices each become processing nodes that can handle smaller, time-sensitive tasks without having to send all their data up to the cloud.
Because each of these devices becomes its own miniature analytics center, they can handle a wide variety of narrowly defined processes in mere milliseconds, leaving the cloud pipeline free for larger-scale analytics work.
According to Cisco, “the fog extends the cloud to be closer to the things that produce and act on IoT data. Fog applications are as diverse as the Internet of Things itself. What they have in common is monitoring or analyzing real-time data from network-connected things and then initiating an action.”
This approach has several benefits. Firstly, it maximizes available big data analytics resources by keeping some tasks out of the main cloud data storage queue and allowing them to be completed more quickly. It is simply not necessary for every analytics action to move from device to cloud and back again, Cisco adds.
“Offshore oil rigs generate 500 GB of data weekly. Commercial jets generate 10 TB for every 30 minutes of flight,” the company points out. “It is not practical to transport vast amounts of data from thousands or hundreds of thousands of edge devices to the cloud. Nor is it necessary, because many critical analyses do not require cloud-scale processing and storage.”
Fog computing devices can triage tasks based on their priority, keeping critical actions within the node, sending data that can wait a few minutes to a larger aggregation node that manages several IoT devices, and passing the rest of the data up to the cloud for long-term storage and historical analysis at the user’s leisure.
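As a rough illustration, that triage policy can be sketched in a few lines of Python. The tiers, sensor names, and thresholds below are invented for the example; they are not drawn from any specific fog platform:

```python
from dataclasses import dataclass

# Hypothetical destinations in a three-tier fog hierarchy.
LOCAL_NODE, AGGREGATION_NODE, CLOUD = "local", "aggregation", "cloud"

@dataclass
class Reading:
    sensor: str
    value: float

def triage(reading: Reading) -> str:
    """Route a sensor reading to the tier that should process it.

    Critical values are handled on the device itself, moderately
    urgent ones go to a nearby aggregation node, and everything else
    is shipped to the cloud for long-term storage and historical
    analysis. Thresholds are illustrative only.
    """
    if reading.sensor == "heart_rate":
        if reading.value < 40 or reading.value > 140:
            return LOCAL_NODE        # act within milliseconds
        if reading.value < 50 or reading.value > 120:
            return AGGREGATION_NODE  # review within minutes
    return CLOUD                     # analyze at the user's leisure

print(triage(Reading("heart_rate", 150)))  # -> local
print(triage(Reading("heart_rate", 72)))   # -> cloud
```

In a real deployment the routing rules would be far richer, but the principle is the same: only the data that needs cloud-scale processing ever makes the trip.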
And because it is not necessary to pass every piece of data up to the cloud, devices can perform their tasks in regions where significant bandwidth or reliable broadband internet is not always available.
In the world of manufacturing, for example, the uses for these protocols are clear. Temperature gauges on chemical vats can instantly shut down processes to avoid a meltdown, or proximity sensors can detect when a worker is entering a restricted or dangerous area and immediately notify supervisors or guards.
When it comes to healthcare, fog computing may be particularly valuable for patient monitoring and crisis response in rural areas.
Instead of the patient making the trip to the provider’s office once every two months to upload heart rate data from a smartwatch, waiting a week or so for the data to be analyzed, and another day or two for a care manager to send an email or make a call once an abnormality is detected, the device itself can send an instant alert to a primary care provider the moment a worrying pattern emerges.
That alert can trigger a care coordination protocol at the provider’s office, or even instantly dispatch emergency services if the patient’s vitals drop dramatically.
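One way such an on-device check might look is a rolling comparison of today’s activity against the patient’s recent baseline, with an alert callback that fires without waiting for the cloud. The seven-day window, the 50 percent drop threshold, and the callback are all assumptions made for this sketch:

```python
from collections import deque
from statistics import mean

class StepCountMonitor:
    """On-device check: alert when today's step count falls far below
    the patient's recent average. Window size and drop threshold are
    illustrative assumptions, not clinical guidance."""

    def __init__(self, alert, window=7, drop_fraction=0.5):
        self.alert = alert                   # callback to notify the care team
        self.history = deque(maxlen=window)  # recent daily step counts
        self.drop_fraction = drop_fraction

    def record_day(self, steps: int) -> None:
        if len(self.history) == self.history.maxlen:
            baseline = mean(self.history)
            if steps < baseline * self.drop_fraction:
                self.alert(steps, baseline)  # fire locally, in milliseconds
        self.history.append(steps)

alerts = []
monitor = StepCountMonitor(lambda steps, base: alerts.append((steps, base)))
for day in [6000, 6400, 5900, 6100, 6300, 6000, 6200]:
    monitor.record_day(day)
monitor.record_day(2500)  # drastic drop -> alert fires on the device
print(alerts)
```

The same pattern generalizes to any vital sign: the device keeps a small local history, and only the alert (plus periodic summaries) ever travels upstream.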
But fog computing is not a replacement for the cloud, the OpenFog Consortium is quick to stress in an explanatory white paper.
“Choosing between a cloud and OpenFog is not a binary decision,” says the industry group, which includes several market leaders such as Cisco, Dell, and Microsoft. “They form a mutually beneficial, inter-dependent continuum. In the continuum, the definition of what is cloud and what is endpoint is relative.”
“These devices are interdependent and mutually beneficial: certain functions are naturally more advantageous to carry out in fog while others are better suited to cloud. In many of these systems the fog and cloud will both be implemented. The segmentation of what tasks go to fog and what goes to the backend cloud are application specific, and could change dynamically.”
How can fog computing enable connected health?
The purpose of the healthcare Internet of Things is to make it easier for patients to stay connected to their providers, and for their providers to deliver accountable, value-based care to their populations. Fog computing may be the foundational infrastructure for transforming the healthcare IoT from novelty to reality.
In order to accomplish this, the healthcare industry must overcome three of its major big data obstacles: the challenge of turning big data into smart data, the geographical distribution of providers and their lack of interoperability, and the stringent patient privacy and security rules that govern the flow of sensitive health data.
As previously discussed, the ability to triage data and make the most crucial decisions within the device’s own environment will help to winnow key insights from the enormous volume of available data.
Edge computing can also take on the second and third challenges, asserts a research paper published in the IEEE Internet of Things journal earlier this year. Computer scientists from Wayne State University envision a “collaborative edge” computing ecosystem that connects health information from disparate organizations through the IoT instead of a traditional health information exchange organization.
“A key promise behind cloud computing is that the data should be already held or is being transmitted to the cloud and will eventually be processed in the cloud,” explain authors Weisong Shi, Jie Cao, Quan Zhang, Youhuizi Li and Lanyu Xu. “In many cases, however, the data owned by stakeholders is rarely shared to each other due to privacy concerns and the formidable cost of data transportation.”
Fog computing can get around these barriers by acting as miniature data processing centers that exchange data without the need for the cloud. Using predefined authorization and user protocols, a patient’s health data could be exposed to each device through a shared interface, but any computations will only take place where the data originates: at the hospital or physician office that holds the patient record.
The concept is similar to accessing a mobile version of an electronic health record from a smartphone. The smartphone doesn’t store any of the patient data itself, but it does allow the user to access or modify information held at a centralized location, based on prearranged permissions.
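In code, the pattern resembles a narrowly scoped query interface: the requester receives only a computed result, and the raw record never leaves the node that holds it. The record structure, query names, and permission model below are hypothetical:

```python
# Hypothetical sketch of "computation stays where the data lives":
# a hospital node answers predefined queries and returns only
# derived results, never the underlying patient record.

class HospitalNode:
    def __init__(self, records, permissions):
        self._records = records          # raw data never leaves this node
        self._permissions = permissions  # requester -> allowed query names

    def query(self, requester: str, query_name: str, patient_id: str):
        if query_name not in self._permissions.get(requester, set()):
            raise PermissionError(f"{requester} may not run {query_name}")
        record = self._records[patient_id]
        if query_name == "avg_glucose":
            readings = record["glucose"]
            return sum(readings) / len(readings)  # derived value only
        raise ValueError(f"unknown query: {query_name}")

node = HospitalNode(
    records={"p1": {"glucose": [110, 130, 120]}},
    permissions={"pharmacy": {"avg_glucose"}},
)
print(node.query("pharmacy", "avg_glucose", "p1"))  # -> 120.0
```

A requester without the right permission gets an error rather than data, which is what lets stakeholders collaborate without ever shipping the full record across organizational boundaries.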
Source: IEEE Internet of Things Journal
Theoretically, IoT interface devices that allow permissions-based access to pieces of the same electronic record could connect pharmacies, payers, hospitals, specialists, and other healthcare stakeholders without the need to transmit a new iteration of the entire health record every time one entity wishes to make a change.
The collaborative edge ecosystem could also take advantage of the security and data integrity protocols of blockchain technology to ensure that users are allowed to view and modify certain datasets, and that all devices are accessing the same up-to-date information.
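The data-integrity half of that idea can be illustrated with a toy hash chain, in which each audit entry is linked to the previous one by its hash, so any after-the-fact tampering is detectable. This is a deliberate simplification of what a real blockchain ledger provides (no consensus, no distribution):

```python
import hashlib
import json

def add_entry(chain, entry: dict) -> None:
    """Append an audit entry linked to the previous one by its hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"entry": entry, "prev": prev_hash}, sort_keys=True)
    chain.append({"entry": entry, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain) -> bool:
    """Recompute every link; any tampering breaks the chain."""
    prev_hash = "0" * 64
    for block in chain:
        payload = json.dumps({"entry": block["entry"], "prev": prev_hash},
                             sort_keys=True)
        if (block["prev"] != prev_hash or
                block["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
        prev_hash = block["hash"]
    return True

chain = []
add_entry(chain, {"actor": "pharmacy", "action": "view", "record": "p1"})
add_entry(chain, {"actor": "hospital", "action": "update", "record": "p1"})
print(verify(chain))                    # -> True
chain[0]["entry"]["action"] = "delete"  # tamper with history
print(verify(chain))                    # -> False
```

In a collaborative edge ecosystem, every participating node could verify the same chain, giving all stakeholders confidence that they are viewing the same up-to-date, unaltered access history.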
Extending the network to include more of the care continuum could improve accountability, make it easier to share data, and connect stakeholders in a cost-effective and secure manner.
In addition to patient-generated health data, the collaborative environment could also integrate operational and financial data required for improving efficiency and cutting costs, such as the GPS location of a hospital-owned tablet, information on the timing of the daily routine of a staff member with an RFID tag in her badge, or an alarm from an inventory sensor that triggers when an important medication is out of stock.
Who is driving the development of fog computing for the Internet of Things?
Some very prominent names in the IT infrastructure world are leading the charge towards a fog-based IoT.
The OpenFog Consortium, founded in November of 2015, has brought together founding members ARM, Cisco, Dell, Intel, and Microsoft, along with the Princeton University Edge Computing Laboratory, to develop the governance and best practices required to advance fog computing principles.
By using open standards to create a widely accepted framework for fog computing at the very beginning of its popularity, the Consortium may be able to avoid some of the interoperability and data standards problems that have particularly plagued the healthcare industry thus far.
“Some scenarios – smart transportation, emergency services, robotics and virtual reality, to name a few – need fog computing for rapid response time,” said OpenFog Executive Director Lynne Canavan in a blog post. “In remote locations, fog computing ensures continual operations when network connectivity can be challenging. On billions of devices, fog computing saves network bandwidth by shifting computation closer to the devices.”
“Enabling these and other advanced IoT scenarios are the focus of the work of the OpenFog Consortium. Our members are solving technical challenges around the eight pillars of an OpenFog architecture: Security, scalability, openness, autonomy, reliability/availability/serviceability, agility, hierarchy and programmability.”
Source: OpenFog Consortium
The group is also affiliated with The Institute of Electrical and Electronics Engineers, Schneider Electric, and GE Digital, and has launched innovation initiatives in Japan, which involve tech giants Toshiba and Fujitsu.
The Consortium also plans to work closely with additional industry groups including Industrial Internet Consortium (IIC), ETSI-MEC (Mobile Edge Computing), Open Connectivity Foundation (OCF) and the OpenNFV to avoid duplication of efforts and establish a harmonious framework for fog computing adoption across the entire Internet of Things.
At the moment, Cisco is one of the most vocal leaders of the nascent edge computing marketplace – the company takes credit for coining the term “fog computing” to begin with – but it seems likely that other major cloud storage and IoT-as-a-service companies, like Amazon Web Services, IBM, Oracle, and Google, will soon jump into this promising field.
But the healthcare industry will have plenty of its own work to do before fog computing or any other framework can produce a meaningful Internet of Things. While the value of a fog-driven IoT with millisecond-long response times seems clear in theory, providers have felt burned by new technologies before.
Crafting IoT interfaces that meet the very specific demands of the clinical workflow will continue to be a challenge, and defining the scope of what constitutes “useful data” for patient care will be another. Creating safe, federally approved medical-grade devices for data collection is no easy task, and distributing them to needy patients and educating consumers about their use will likely be a difficult proposition for some time to come.
However, the promises of the healthcare Internet of Things are generally very exciting for developers and open-minded practitioners, and fog computing may solve many of the problems that are currently stunting the growth of this innovative ecosystem. Should the IoT blossom in the manner anticipated by leading members of the healthcare community, patient care may be on the verge of yet another big data revolution.