Machine Learning in Healthcare Helps Prosthetic Hands Feel

A case of machine learning in healthcare shows that algorithms and liquid metal could lead to the development of prosthetic hands that can feel the objects they touch.

By Erin McNemar, MPA

By using machine learning in healthcare, researchers from Florida Atlantic University's College of Engineering and Computer Science and collaborators are creating prosthetic hands that can “feel” by incorporating stretchable tactile sensors using liquid metal on the fingertips.

“Encapsulated within silicone-based elastomers, this technology provides key advantages over traditional sensors, including high conductivity, compliance, flexibility and stretchability. This hierarchical multi-finger tactile sensation integration could provide a higher level of intelligence for artificial hands,” the press release stated.

Each fingertip has more than 3,000 touch receptors that respond to pressure. The sensation felt in the fingertips is what humans rely on to manipulate objects. Individuals with upper limb amputations face a unique challenge without that pressure-based sense of touch.

Although several high-tech, dexterous prosthetics are available, they still lack a sense of touch. Without sensory feedback, prosthetic hands often drop or crush the objects they handle.

For the study, researchers used individual fingertips on the prosthetic hand to differentiate between various speeds of a sliding motion along four different textured surfaces. The textures varied in a single parameter: the distance between the ridges. To detect the textures and speeds, researchers trained four machine learning algorithms.

Researchers then conducted 20 trials on each of 10 complex surfaces, made up of randomly generated permutations of the four textures, to test whether the machine learning algorithms could distinguish among them.
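To make that experimental structure concrete, the short sketch below is purely illustrative: the article does not publish the surface definitions, so the texture names and random orderings are assumptions, but it shows how ten multi-textured surfaces and 20 trials per surface could be laid out.

```python
# Illustrative layout of the trial structure described above; the actual surface
# compositions and sensor data are not published here, so these are placeholders.
import random

random.seed(0)
base_textures = ["texture_A", "texture_B", "texture_C", "texture_D"]  # hypothetical names

# Ten complex surfaces, each a randomly generated permutation of the four base textures.
surfaces = [random.sample(base_textures, k=len(base_textures)) for _ in range(10)]

# Twenty trials per surface -> 200 labeled trials in total.
trials = [(surface_id, trial) for surface_id in range(10) for trial in range(20)]

print(len(trials), "trials across", len(surfaces), "surfaces")
print("Surface 0 texture order:", surfaces[0])
```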

The results revealed that the tactile information from the liquid metal sensors was able to differentiate between the multi-textured surfaces, demonstrating a new form of hierarchical intelligence. Additionally, the machine learning algorithms were able to distinguish between all of the speeds with high accuracy.

“Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors,” Erik Engeberg, PhD, senior author and an associate professor in the Department of Ocean and Mechanical Engineering, said in a press release.

"The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip,” Engeberg continued.

“We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand.”

The team of researchers compared four different machine learning algorithms on their ability to classify the surfaces and speeds: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN).

The time-frequency features of the liquid metal sensor signals were extracted to train and test the machine learning algorithms. The NN performed the best, detecting speed and texture with 99.2 percent accuracy.
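For readers who want a concrete picture of that four-way comparison, the scikit-learn sketch below is a minimal illustration rather than the research team's actual pipeline: each of the four algorithms named above is trained on the same feature matrix and scored on a held-out test set. The array shapes, feature values, and hyperparameters are assumptions; in the study, real time-frequency features would come from the liquid metal sensor signals rather than random numbers.

```python
# Hypothetical sketch of a four-way classifier comparison (KNN, SVM, RF, NN);
# data and settings are placeholders, not the study's published configuration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder stand-in for the feature matrix: 200 trials (20 per surface x 10
# surfaces) by 40 features; real values would be time-frequency features.
X = rng.normal(size=(200, 40))
y = np.repeat(np.arange(10), 20)  # surface label 0-9 for each trial

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

models = {
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "NN": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
    ),
}

# Train each model on the same split and report held-out accuracy.
for name, model in models.items():
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {accuracy:.1%} test accuracy")
```

Scoring all four models on the same held-out split is what allows a statement like “the NN performed best” to be made on equal footing.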

"The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities," Stella Batalama, PhD, dean, College of Engineering and Computer Science said in a press release.

"Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch. They also don't enable them to control the prosthetic limb naturally with their minds,” Batalama continued.

“With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can 'feel' and respond to its environment."

Researchers believe that this artificial intelligence technology can improve the control of prosthetic hands and the lives of those who need them.