Dynamic prototype with discriminative representation for rapid adaptation in new organ segmentation

Hailing Wang, Yu Chen, Xinyue Zhang, Guitao Cao*, Wenming Cao
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Recent work on label-efficient prototype-based learning has demonstrated significant potential for rapid adaptation in new organ segmentation. However, a prevalent challenge in prototype extraction within the medical domain is semantic bias. To address this issue, we propose a Dynamic Prototype with Discriminative Representation Network (DPDRNet) to enhance the effectiveness of semantic class prototypes for new organs. Specifically, we introduce a self-attention mechanism that generates dynamic prototypes by capturing interdependencies among pixel-level prototypes extracted from limited labeled samples, enabling more efficient use of local information. Subsequently, we design a prototype contrastive learning method that maintains the discriminative representation of the dynamic prototype in the high-level feature space, strengthening its correlation with foreground features while increasing its separation from background features. By combining the self-attention mechanism with contrastive learning, the proposed dynamic prototype generalizes better, facilitating more precise segmentation of new organ structures. Experimental results demonstrate that our method achieves effective performance on cardiac and abdominal MRI segmentation tasks.
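The two mechanisms the abstract names can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the identity attention projections, the mean pooling, the InfoNCE-style loss form, and all function names (`dynamic_prototype`, `proto_contrastive_loss`) and parameters (e.g. the temperature `tau`) are illustrative assumptions, not details from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_prototype(pixel_protos):
    """Self-attention over N pixel-level prototypes (N, D) -> one (D,) dynamic prototype.

    Sketch only: projections for Q/K/V are taken as identity for brevity.
    """
    q = k = v = pixel_protos
    attn = softmax(q @ k.T / np.sqrt(q.shape[1]))  # (N, N) interdependencies
    refined = attn @ v                             # each prototype attends to all others
    return refined.mean(axis=0)                    # pool into the dynamic prototype

def proto_contrastive_loss(proto, fg_feats, bg_feats, tau=0.1):
    """InfoNCE-style loss: pull the prototype toward foreground features,
    push it away from background features (assumed form, not the paper's exact loss)."""
    def cos(a, b):  # cosine similarity between vector a and rows of b
        return (b @ a) / (np.linalg.norm(b, axis=1) * np.linalg.norm(a) + 1e-8)
    pos = np.exp(cos(proto, fg_feats) / tau).sum()
    neg = np.exp(cos(proto, bg_feats) / tau).sum()
    return -np.log(pos / (pos + neg))
```

Under this formulation, the loss is small when the dynamic prototype aligns with foreground features and large when it aligns with background ones, which is the discriminative behavior the abstract describes.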

Original language: English
Article number: 112870
Journal: Pattern Recognition
Volume: 173
State: Published - May 2026

Keywords

  • Contrastive learning
  • Discriminative representation
  • Few-shot segmentation
  • Prototype learning
  • Self-attention mechanism
