Proof-of-concept study “highlights that using AI to integrate different types of clinically-sound data to predict disease outcomes is feasible,” researchers say
Artificial intelligence (AI) and machine learning are making incremental strides toward demonstrating their value in the world of pathology diagnostics. However, human anatomical pathologists are usually still required to render a prognosis. Now, in a proof-of-concept study, researchers at Brigham and Women’s Hospital in Boston have developed a method that uses AI models to integrate multiple types of data from disparate sources to accurately predict patient outcomes for 14 different types of cancer.
The process also revealed “the predictive basis of features used to predict patient risk, a property that could be used to uncover new biomarkers,” according to Genetic Engineering & Biotechnology News (GEN).
Should this research become clinically actionable, anatomical pathologists could be given powerful new AI tools specifically designed to help them predict what kind of outcome a cancer patient can expect.
The Brigham scientists published their findings in the journal Cancer Cell, in a paper titled “Pan-Cancer Integrative Histology-Genomic Analysis via Multimodal Deep Learning.”

“Experts analyze a lot of evidence to predict how well a patient will do. These early investigations form the basis for decisions about enrollment in a clinical trial or specific treatment regimens,” said Dr. Faisal Mahmood (above) in a press release from Brigham. “But this means that this multimodal prediction takes place at the expert level. We’re trying to approach the problem computationally,” he added. Should they prove clinically useful through additional studies, these findings could lead to tools that help anatomical pathologists and clinical laboratory scientists more accurately predict what kind of outcomes cancer patients might experience. (Photo copyright: Harvard.)
AI-based predictions in pathology and clinical laboratory medicine
Brigham’s team constructed their AI model using the Cancer Genome Atlas (TCGA), a publicly available resource that contains data on many types of cancer. They then created a deep learning-based algorithm that examines information from various data sources.
Pathologists have traditionally relied on several different data sources, such as pathology images, genomic sequencing, and patient history, to diagnose various types of cancer and develop a prognosis.
For their research, Mahmood and his colleagues trained and validated their AI algorithm using 6,592 H&E (hematoxylin and eosin) whole-slide images (WSIs) from 5,720 cancer patients. Molecular profile features, including mutational status, copy number variation, and RNA sequencing expression, were also input into the model to measure and explain the relative risk of cancer death.
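To make the approach concrete, the sketch below shows one generic way such a model can be wired together: pre-extracted slide-level features and a molecular profile vector are each encoded, concatenated, and mapped to a single relative-risk score trained with a Cox-style survival loss. This is a minimal illustration only, not the Brigham team’s published architecture; the feature dimensions, layer sizes, and loss formulation are assumptions made for the example.

```python
# Illustrative sketch only -- NOT the published PORPOISE architecture.
# Shows the general idea of fusing slide-derived features with molecular
# profile features to produce a relative-risk score for survival analysis.
import torch
import torch.nn as nn

class MultimodalRiskModel(nn.Module):
    def __init__(self, wsi_dim=1024, omics_dim=200, hidden_dim=256):
        super().__init__()
        # Encoder for slide-level features (assumed to be precomputed from
        # H&E whole-slide image patches and pooled into one vector).
        self.wsi_encoder = nn.Sequential(
            nn.Linear(wsi_dim, hidden_dim), nn.ReLU(), nn.Dropout(0.25)
        )
        # Encoder for molecular features (mutation status, copy number
        # variation, RNA-seq expression concatenated into one vector).
        self.omics_encoder = nn.Sequential(
            nn.Linear(omics_dim, hidden_dim), nn.ReLU(), nn.Dropout(0.25)
        )
        # Fusion head: concatenate the two modality embeddings and map
        # them to one relative-risk score per patient.
        self.risk_head = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1)
        )

    def forward(self, wsi_features, omics_features):
        h = torch.cat([self.wsi_encoder(wsi_features),
                       self.omics_encoder(omics_features)], dim=1)
        # Higher output = higher estimated risk of cancer death.
        return self.risk_head(h).squeeze(1)

def cox_partial_likelihood_loss(risk, time, event):
    """Negative Cox partial log-likelihood (simple version, no tie handling).

    risk:  (N,) predicted log-risk scores
    time:  (N,) observed survival or censoring times
    event: (N,) 1.0 if death observed, 0.0 if censored
    """
    order = torch.argsort(time, descending=True)   # risk set becomes a prefix
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)   # log of summed exp(risk) over risk set
    return -((risk - log_cumsum) * event).sum() / event.sum().clamp(min=1)
```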
The scientists “evaluated the effectiveness of the model by feeding it data sets from 14 cancer types as well as histological and genomic data from patients. The results showed that the models provided more accurate predictions of patient outcomes than models that only considered individual sources of information,” Brigham said in a press release.
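Comparisons of this kind are usually reported with the concordance index (c-index), which measures how often a model ranks patients who died sooner as higher risk. The snippet below is a minimal, self-contained illustration with made-up numbers, not the study’s evaluation code.

```python
# Illustrative sketch: a minimal concordance index (c-index) used to compare
# a multimodal risk score against a single-modality one. Data are invented.
import numpy as np

def concordance_index(time, event, risk):
    """Fraction of usable patient pairs whose predicted risks are ordered
    consistently with observed survival (1.0 = perfect, 0.5 = chance)."""
    concordant, usable = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                      # pairs must be anchored on an observed event
        for j in range(n):
            if time[j] > time[i]:         # patient j outlived patient i
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1       # correctly ranked
                elif risk[i] == risk[j]:
                    concordant += 0.5     # tied risks count half
    return concordant / usable if usable else float("nan")

# Hypothetical scores for four patients (times, event indicators, risks).
time = np.array([5.0, 8.0, 12.0, 20.0])
event = np.array([1, 1, 0, 1])
risk_multimodal = np.array([0.9, 0.6, 0.4, 0.1])
risk_unimodal = np.array([0.5, 0.7, 0.4, 0.2])
print(concordance_index(time, event, risk_multimodal))  # 1.0
print(concordance_index(time, event, risk_unimodal))    # 0.8
```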
“This work sets the stage for larger AI studies in healthcare that combine data from multiple sources,” said Faisal Mahmood, PhD, Associate Professor, Division of Computational Pathology, Brigham and Women’s Hospital, and Associate Member, Cancer Program, Broad Institute of MIT and Harvard, in the press release. “More broadly, our results emphasize the need to build computational pathology prognostic models with much larger datasets and downstream clinical trials to determine utility.”
Future predictions based on multiple data sources
The Brigham researchers also created a research tool they called Pathology-omics Research Platform for Integrative Survival Estimation (PORPOISE). This tool serves as an interactive platform that can provide prognostic markers detected by the algorithm for thousands of patients across different cancer types.
Researchers believe their algorithm reveals another role for AI technology in medical care, but more research is needed before their model can be implemented clinically. Larger datasets need to be examined, and the researchers plan to incorporate more types of patient information, such as radiology scans, family histories, and electronic medical records.
“Future work will focus on developing more focused prognostic models by curating larger multimodal datasets for individual disease models, fitting models to large independent multimodal testing cohorts, and using multimodal deep learning to predict response and therapy resistance,” the researchers state in their Cancer Cell paper.
“As research advances, sequencing technologies such as single-cell RNA-seq, mass cytometry, and spatial transcriptomics will continue to mature and, in combination with whole-slide imaging, gain clinical penetration, and our approach to understanding molecular biology will become increasingly spatially resolved and multimodal,” the researchers concluded.
Anatomical pathologists may find the findings of the Brigham and Women’s Hospital research team intriguing. An AI tool that integrates data from disparate sources, analyzes that information, and provides useful insights could one day help them make more accurate cancer predictions and improve patient care.
—JP Schlingman
Related information:
AI integrates multiple data types to predict cancer outcomes
Pan-Cancer Integrative Histology-Genomic Analysis via Multimodal Deep Learning
New AI technology integrates multiple data types to predict cancer outcomes
Artificial intelligence in digital pathology tends towards practical tools
Florida hospital is using a machine learning and artificial intelligence platform to reduce clinical variation in its healthcare delivery, impacting medical labs
Artificial Intelligence and Computational Pathology