CaMIL: Causal Multiple Instance Learning for Whole Slide Image Classification

Kaitao Chen, Shiliang Sun*, Jing Zhao

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

23 Scopus citations

Abstract

Whole slide image (WSI) classification is a crucial component of automated pathology analysis. Due to the inherent challenges of high-resolution WSIs and the absence of patch-level labels, most proposed methods follow the multiple instance learning (MIL) formulation. While MIL has been equipped with excellent instance feature extractors and aggregators, it is prone to learning spurious associations that undermine model performance. For example, relying solely on color features may lead to erroneous diagnoses due to spurious associations between the disease and the color of patches. To address this issue, we develop a causal MIL framework for WSI classification that effectively distinguishes between causal and spurious associations. Specifically, we use the expectation of the intervention P(Y |do(X)) for bag prediction rather than the traditional likelihood P(Y |X). By applying the front-door adjustment, the spurious association is effectively blocked, where the intervened mediator is aggregated from patch-level features. We evaluate our proposed method on two publicly available WSI datasets, Camelyon16 and TCGA-NSCLC. Our causal MIL framework shows outstanding performance and is plug-and-play, seamlessly integrating with various feature extractors and aggregators.
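For reference, the abstract's use of P(Y |do(X)) via the front-door adjustment follows the standard front-door formula from causal inference; how CaMIL instantiates the mediator (here, an aggregate of patch-level features) is specific to the paper, but the general identity for a mediator M between X and Y is:

```latex
P(Y \mid do(X=x)) \;=\; \sum_{m} P(M=m \mid X=x) \sum_{x'} P(Y \mid M=m,\, X=x')\, P(X=x')
```

The inner sum marginalizes over the observed distribution of X, which is what blocks the back-door (spurious) path from X to Y that does not pass through the mediator.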
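As an illustration of the kind of instance aggregator such a plug-and-play framework builds on, below is a minimal NumPy sketch of attention-based MIL pooling over patch features (ABMIL-style softmax attention). The function name, parameter shapes, and random example are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def attention_mil_pool(instance_feats, v, w):
    """Aggregate patch-level features into one bag embedding.

    instance_feats: (N, D) array of N patch features from a feature extractor.
    v: (D, H) projection matrix; w: (H,) attention vector (illustrative params).
    Returns the (D,) attention-weighted bag feature and the (N,) weights.
    """
    scores = np.tanh(instance_feats @ v) @ w        # (N,) unnormalized attention
    exps = np.exp(scores - scores.max())            # stable softmax
    alpha = exps / exps.sum()                       # (N,) weights summing to 1
    bag_feat = alpha @ instance_feats               # (D,) weighted average
    return bag_feat, alpha

# Toy usage: 8 patches with 16-dim features, randomly initialized parameters.
rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 16))
v = rng.standard_normal((16, 4))
w = rng.standard_normal(4)
bag, alpha = attention_mil_pool(feats, v, w)
```

A bag-level classifier (here, whatever head produces P(Y |do(X))) would then operate on `bag`; the attention weights `alpha` indicate which patches drove the prediction.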

Original language: English
Pages (from-to): 1120-1128
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 38
Issue number: 2
DOIs
State: Published - 25 Mar 2024
Event: 38th AAAI Conference on Artificial Intelligence, AAAI 2024 - Vancouver, Canada
Duration: 20 Feb 2024 - 27 Feb 2024
