News, Analysis, Trends, Management Innovations for
Clinical Laboratories and Pathology Groups

Hosted by Robert Michel


Enhanced functionality of software promises a giant boost in tissue analysis

Surgical pathologists may have an exciting new tool for identifying and classifying cancer-related cells. Medical researchers at Duke University are demonstrating that “active learning” software developed for finding and recognizing undersea mines can help pathologists identify and classify cancer-related cells.

The Duke research team embedded the active learning software into an existing software toolkit, called FARSIGHT, which is a collection of software modules. FARSIGHT was designed to rapidly analyze images of human tissue collected from laser-scanning microscopes, an article in University of Houston Engineering News explained. It can be scripted to accomplish a variety of automated image analysis tasks, from analyzing brain tissue to studying the effectiveness of medications.

“The results are spectacular,” said Lawrence Carin, Ph.D., Professor in the Electrical Engineering Department at Duke, in an Office of Naval Research press release. “This could be a game-changer for medical research.”

BioPhotonics.com described the work that Badri Roysam, Ph.D., and other researchers are doing with the FARSIGHT technology. It wrote that: “FARSIGHT employs multiple segmentation algorithms developed over the past decade to automatically delineate all major cell types and structures in the brain and to calculate the relationships among them. The results can be inspected and validated efficiently by neuroscientists.” (Photo copyright BioPhotonics.com)

FARSIGHT was developed by a group at the University of Houston. The National Institutes of Health (NIH) and the Defense Advanced Research Projects Agency (DARPA) funded the project. It works by identifying cells based on a subset of examples initially labeled by a physician. The problem is that the resulting classifications can be erroneous because the computer-applied tags are based on a small sampling.

“There was no way for the biologist to pick the mathematically most informative examples for FARSIGHT to learn from,” said Badri Roysam, Ph.D., Chair of the Department of Electrical and Computer Engineering at the University of Houston, and leader of the group that developed FARSIGHT.

In order for the software to be able to perform independent analysis, researchers would often have to guide it through hundreds of sample biological images. This process was time-consuming and not conducive to the project’s goal of accelerating biotech and clinical research.

Thus, the collaboration between the two research teams made sense. The Navy faced similar challenges in analyzing images. Hard-to-detect undersea mines pose a significant threat to naval operations. To try to detect every possible target across the vast amount of seafloor to be examined would be prohibitively time-consuming for a human operator.

Carin’s team developed the ONR “active learning” software to help the Navy’s mine-sweeping robots learn how to distinguish random flotsam and jetsam from unexploded mines. The software enabled the robotic systems to behave more like humans when uncertain about how to classify an object.

Now the newly embedded ONR active learning algorithms make identification of cells more accurate. FARSIGHT’s performance is more consistent because it is now able to pick out the most valuable images from which to learn.

The ONR software allows FARSIGHT to dramatically reduce the number of samples it has to review before it can independently analyze images, added the UH Engineering News piece. The enhanced toolkit also reduced the number of cell samples that physicians and pathologists need to label. This is because the ONR algorithm automatically selects the best set of examples to teach the software.
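The selection strategy described above is a form of pool-based active learning. Below is a minimal sketch of one common variant, uncertainty sampling, in which the software repeatedly asks the expert to label the example it is least sure about. All names, features, and the simple threshold classifier here are illustrative assumptions for demonstration; this is not the actual FARSIGHT or ONR code.

```python
# Sketch of pool-based active learning via uncertainty sampling.
# Illustrative only: a 1-D synthetic "cell feature" stands in for real
# microscope image features, and a threshold stands in for a real classifier.
import random
import math

random.seed(0)

# Synthetic pool: (feature, true_label); label 1 = target cell type, 0 = other.
pool = [(random.gauss(0.0, 1.0), 0) for _ in range(200)] + \
       [(random.gauss(2.0, 1.0), 1) for _ in range(200)]

def train(labeled):
    """Fit a trivial 1-D classifier: threshold at the midpoint of class means."""
    means = {}
    for cls in (0, 1):
        vals = [x for x, y in labeled if y == cls]
        means[cls] = sum(vals) / len(vals)
    return (means[0] + means[1]) / 2.0

def predict_prob(threshold, x):
    """Logistic confidence that x belongs to class 1."""
    return 1.0 / (1.0 + math.exp(-(x - threshold)))

# Start from a tiny labeled seed, like the handful of examples a
# pathologist labels up front (one known example of each class).
labeled = [pool[0], pool[200]]
unlabeled = [s for s in pool if s not in labeled]

for _ in range(20):
    threshold = train(labeled)
    # Uncertainty sampling: request a label for the sample whose predicted
    # probability is closest to 0.5 -- the most informative example.
    pick = min(unlabeled,
               key=lambda s: abs(predict_prob(threshold, s[0]) - 0.5))
    unlabeled.remove(pick)
    labeled.append(pick)  # the expert "oracle" reveals the true label

threshold = train(labeled)
correct = sum(1 for x, y in pool if (x > threshold) == (y == 1))
accuracy = correct / len(pool)
print(f"labels requested: {len(labeled)}, pool accuracy: {accuracy:.2f}")
```

The key design point, matching the article's description, is that the learner chooses which examples the human labels next, rather than the human guiding it through hundreds of samples chosen at random.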

“This is not a typical Navy [application] transition,” Carin stated. “But it is a transition to a very important medical tool used literally at hospitals around the world. There is a real chance this may save lives in the future.”

Research pathologists at the University of Pennsylvania are already applying the ONR algorithms to examine tumors from kidney cancer patients, the ONR press release stated. University of Pennsylvania’s research team is focusing on endothelial cells that form the blood vessels that supply the tumors with oxygen and nutrients.

“With the computer program having learned to pick out an endothelial cell, we have now automated this process, and it seems to be highly accurate,” said William Lee, M.D., Associate Professor of Medicine, Hematology and Oncology at the University of Pennsylvania. He is leading the research effort. “We can begin to study the endothelial cells of human cancer—something that is not being done because it’s so difficult and time-consuming.”

Currently, the enhanced FARSIGHT toolkit requires only a few hours to pick out all the endothelial cells in 100 images with human-level accuracy. For a pathologist to accomplish the same task manually could take days or even weeks. As one application of the new technology, researchers hope to eventually improve drug treatments for different types of kidney cancer.

The ONR-enhanced FARSIGHT toolkit is one more example of how technology developed outside the clinical laboratory testing industry has the potential to provide anatomic pathologists with new capabilities to detect disease and identify potential therapies for individual patients. This novel technology, when it is ready for use in clinical settings, will likely be adopted by many medical laboratories.

 

—Pamela Scherer McLeod

 

Related Information:

Mine-Hunting Software Helping Doctors to Identify Rare Cells in Human Cancer

Tissue Analysis Gets Boost from Cross-Disciplinary Collaboration

Software interprets 3-D confocal microscopy images
