Proof-of-concept study ‘highlights that using AI to integrate different types of clinically informed data to predict disease outcomes is feasible,’ researchers say

Artificial intelligence (AI) and machine learning are—in stepwise fashion—making progress in demonstrating value in the world of pathology diagnostics. But human anatomic pathologists are generally required for a prognosis. Now, in a proof-of-concept study, researchers at Brigham and Women’s Hospital in Boston have developed a method that uses AI models to integrate multiple types of data from disparate sources to accurately predict patient outcomes for 14 different types of cancer.

The process also uncovered “the predictive bases of features used to predict patient risk—a property that could be used to uncover new biomarkers,” according to Genetic Engineering and Biotechnology News (GEN).

Should these research findings become clinically viable, anatomic pathologists may gain powerful new AI tools specifically designed to help them predict what type of outcome a cancer patient can expect.

The Brigham scientists published their findings in the journal Cancer Cell, titled, “Pan-cancer Integrative Histology-genomic Analysis via Multimodal Deep Learning.”

Faisal Mahmood, PhD

“Experts analyze many pieces of evidence to predict how well a patient may do. These early examinations become the basis of making decisions about enrolling in a clinical trial or specific treatment regimens,” said Faisal Mahmood, PhD (above) in a Brigham press release. “But that means that this multimodal prediction happens at the level of the expert. We’re trying to address the problem computationally,” he added. Should they be proven clinically viable through additional studies, these findings could lead to useful tools that help anatomic pathologists and clinical laboratory scientists more accurately predict what types of outcomes cancer patients may experience. (Photo copyright: Harvard.)

AI-based Prognostics in Pathology and Clinical Laboratory Medicine

The team at Brigham constructed their AI model using The Cancer Genome Atlas (TCGA), a publicly available resource that contains data on many types of cancer. They then created a deep learning-based algorithm that examines information from different data sources.

Pathologists traditionally depend on several distinct sources of data, such as pathology images, genomic sequencing, and patient history to diagnose various cancers and help develop prognoses.

For their research, Mahmood and his colleagues trained and validated their AI algorithm on 6,592 H&E (hematoxylin and eosin) whole slide images (WSIs) from 5,720 cancer patients. Molecular profile features, including mutation status, copy-number variation, and RNA sequencing expression, were also input into the model to measure and explain relative risk of cancer death.
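The core idea of combining slide-derived features with molecular features can be illustrated with a toy sketch. Everything below is hypothetical for illustration (the function name, the feature values, and the linear-plus-sigmoid scoring); the published model uses deep neural networks and far richer fusion than this simple concatenation:

```python
import math

def fuse_and_score(histology_features, molecular_features, weights, bias=0.0):
    """Toy late-fusion risk score: concatenate histology-derived and
    molecular feature vectors, then apply a linear model and a sigmoid.
    Illustrative only -- not the published deep-learning architecture."""
    fused = histology_features + molecular_features  # simple concatenation
    if len(weights) != len(fused):
        raise ValueError("weight vector must match fused feature length")
    z = sum(w * x for w, x in zip(weights, fused)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # relative-risk score in (0, 1)

# Hypothetical example: 3 slide-derived features + 2 molecular features
risk = fuse_and_score([0.2, 0.5, 0.1], [1.0, 0.0],
                      weights=[0.4, -0.3, 0.8, 0.6, -0.2])
```

The point of the sketch is only that both data types enter a single model, so the risk score can reflect evidence from either modality, which is what distinguishes the Brigham approach from models built on a single data source.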

The scientists “evaluated the model’s efficacy by feeding it data sets from 14 cancer types as well as patient histology and genomic data. Results demonstrated that the models yielded more accurate patient outcome predictions than those incorporating only single sources of information,” states a Brigham press release.

“This work sets the stage for larger healthcare AI studies that combine data from multiple sources,” said Faisal Mahmood, PhD, Associate Professor, Division of Computational Pathology, Brigham and Women’s Hospital; and Associate Member, Cancer Program, Broad Institute of MIT and Harvard, in the press release. “In a broader sense, our findings emphasize a need for building computational pathology prognostic models with much larger datasets and downstream clinical trials to establish utility.”

Future Prognostics Based on Multiple Data Sources

The Brigham researchers also generated a research tool they dubbed the Pathology-omics Research Platform for Integrative Survival Estimation (PORPOISE). This tool serves as an interactive platform that can yield prognostic markers detected by the algorithm for thousands of patients across various cancer types.  

The researchers believe their algorithm reveals another role for AI technology in medical care, but that more research is needed before their model can be implemented clinically. Larger data sets will have to be examined, and the researchers plan to incorporate more types of patient information, such as radiology scans, family histories, and electronic medical records, in future tests of their AI technology.

“Future work will focus on developing more focused prognostic models by curating larger multimodal datasets for individual disease models, adapting models to large independent multimodal test cohorts, and using multimodal deep learning for predicting response and resistance to treatment,” the Cancer Cell paper states.

“As sequencing technologies such as single-cell RNA-seq, mass cytometry, and spatial transcriptomics continue to mature and gain clinical penetrance, in combination with whole-slide imaging, our approach to understanding molecular biology will become increasingly spatially resolved and multimodal,” the researchers concluded.

Anatomic pathologists may find the Brigham and Women’s Hospital research team’s findings intriguing. An AI tool that integrates data from disparate sources, analyzes that information, and provides useful insights, could one day help them provide more accurate cancer prognoses and improve the care of their patients.   

JP Schlingman

Related Information:

AI Integrates Multiple Data Types to Predict Cancer Outcomes

Pan-cancer Integrative Histology-genomic Analysis via Multimodal Deep Learning

New AI Technology Integrates Multiple Data Types to Predict Cancer Outcomes

Artificial Intelligence in Digital Pathology Developments Lean Toward Practical Tools

Florida Hospital Utilizes Machine Learning Artificial Intelligence Platform to Reduce Clinical Variation in Its Healthcare, with Implications for Medical Laboratories

Artificial Intelligence and Computational Pathology