Machine Learning Microscope Adapts Lighting to Improve Diagnosis

The microscope leverages machine learning to adjust lighting angles, colors, and patterns to improve malaria diagnosis.

By Jessica Kent

Engineers at Duke University have developed a prototype microscope that adapts its lighting patterns while teaching itself the optimal settings needed to complete a given diagnostic task.

In the initial proof-of-concept study, the microscope developed a lighting pattern and classification system that allowed it to quickly detect red blood cells infected by the malaria parasite more accurately than trained physicians.

Engineers built a bowl-shaped light source with LEDs embedded throughout its surface. This design illuminates samples from different angles up to nearly 90 degrees with different colors, which casts shadows and highlights different features of the sample.

“A standard microscope illuminates a sample with the same amount of light coming from all directions, and that lighting has been optimized for human eyes over hundreds of years,” said Roarke Horstmeyer, assistant professor of biomedical engineering at Duke.

“But computers can see things humans can’t. So not only have we redesigned the hardware to provide a diverse range of lighting options, we’ve allowed the microscope to optimize the illumination for itself.”

Researchers fed the microscope hundreds of samples of malaria-infected red blood cells. The microscope learned which features of the sample were most important for diagnosing malaria and how to best highlight those features.
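In broad strokes, this kind of joint optimization can be sketched as a single training loop that learns the LED weighting and the classifier together. The sketch below is illustrative only: the LED count, the synthetic data, and the simple logistic classifier are assumptions, not the Duke team's actual hardware or pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each sample is imaged under n_leds illumination
# channels, and the composite image is a weighted sum of those channels.
# We jointly learn the LED weights and a logistic classifier by gradient
# descent on a log-loss (a sketch of the general idea, not Duke's method).
n_leds, n_pix, n_samples = 8, 16, 400

# Synthetic data: illumination channel 3 carries the diagnostic signal.
X = rng.normal(size=(n_samples, n_leds, n_pix))
y = rng.integers(0, 2, n_samples)
X[y == 1, 3, :] += 1.0  # "infected" samples stand out under LED 3

led_w = np.ones(n_leds) / n_leds   # illumination pattern (learned)
clf_w = np.zeros(n_pix)            # classifier weights (learned)
b = 0.0
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(300):
    comp = np.einsum("l,nlp->np", led_w, X)   # composite images
    p = sigmoid(comp @ clf_w + b)
    err = p - y                               # dLoss/dlogit for log-loss
    # Backpropagate through both the classifier and the LED weights.
    clf_w -= lr * (comp.T @ err) / n_samples
    b -= lr * err.mean()
    led_w -= lr * np.einsum("n,nlp,p->l", err, X, clf_w) / n_samples

# After training, the weight on the informative LED should dominate,
# mirroring how the microscope learns which lighting highlights parasites.
comp = np.einsum("l,nlp->np", led_w, X)
acc = ((sigmoid(comp @ clf_w + b) > 0.5) == y).mean()
```

Because the illumination weights sit inside the same loss as the classifier, gradient descent pushes brightness toward whichever lighting channels make the two classes easiest to separate, which is the core idea the researchers describe.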

The microscope produced images that highlight the malaria parasite in a bright spot and correctly classified malaria about 90 percent of the time. Trained physicians and other machine learning algorithms were correct only about 75 percent of the time.

“The patterns it’s picking out are ring-like with different colors that are non-uniform and are not necessarily obvious,” said Horstmeyer. “Even though the images are dimmer and noisier than what a clinician would create, the algorithm is saying it’ll live with the noise, it just really wants to get the parasite highlighted to help it make a diagnosis.”

After seeing the microscope’s initial success, the team sent the LED pattern and machine learning algorithm to a partnering lab, where it performed with a similar level of accuracy.

The team expects the approach to shorten the time it takes physicians to identify and diagnose malaria in patients.

“Physicians have to look through a thousand cells to find a single malaria parasite,” said Horstmeyer. “And because they have to zoom in so closely, they can only look at maybe a dozen at a time, and so reading a slide takes about 10 minutes. If they only had to look at a handful of cells that our microscope has already picked out in a matter of seconds, it would greatly speed up the process.”

Machine learning tools have recently been shown to improve diagnostics and medical imaging. A study published in JAMA Network Open demonstrated the potential for machine learning to identify cancerous esophageal tissue on microscopy images without the manual data input required by current methods.

“Data annotation is the most time-consuming and laborious bottleneck in developing modern deep learning methods,” said Saeed Hassanpour, PhD, lead author of the study.

“Our study shows that deep learning models for histopathology slides analysis can be trained with labels only at the tissue level, thus removing the need for high-cost data annotation and creating new opportunities for expanding the application of deep learning in digital pathology.”

Engineers at Duke plan to continue to develop the machine learning algorithm and the microscope.

“We’re basically trying to impart some brains into the image acquisition process,” said Horstmeyer. “We want the microscope to use all of its degrees of freedom. So instead of just dumbly taking images, it can play around with the focus and illumination to try to get a better idea of what’s on the slide, just like a human would.”