Dual-Expert Distillation Network for Few-Shot Segmentation

  • Junhang Zhang
  • Zisong Zhuang
  • Luwei Xiao
  • Xingjiao Wu*
  • Tianlong Ma
  • Liang He

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Few-shot segmentation has attracted growing interest owing to its value in practical applications. The primary challenge of few-shot segmentation lies in discovering semantic information, especially in query images. To tackle this issue, we propose a dual-expert distillation network (DEDN), comprising a scenario-level expert and an object-level expert, to capture semantic information from different perspectives. In DEDN, the experts learn from each other through online knowledge distillation with a positive-guided Kullback-Leibler divergence. We introduce Scenario Normalization and Object Continuity Guidance for the two experts to preserve their respective perspectives. We further propose Adaptive Weighted Fusion to adapt the trained experts to novel classes and obtain reliable fused predictions. Extensive experiments on PASCAL-5^i and COCO-20^i show that our approach achieves state-of-the-art results.
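The abstract describes the two experts teaching each other via online knowledge distillation with a KL-divergence objective. The paper's positive-guided variant is not specified here, so the following is only a minimal NumPy sketch of the generic symmetric form: each expert's class distribution is pulled toward the other's. All function names are illustrative, not the authors' code.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl_divergence(p, q, eps=1e-8):
    # KL(p || q), summed over the class axis; eps avoids log(0).
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def mutual_distillation_loss(logits_a, logits_b):
    # Symmetric online distillation: each expert is distilled
    # toward the other's prediction, so both act as teacher and
    # student within a single training step.
    p_a, p_b = softmax(logits_a), softmax(logits_b)
    return kl_divergence(p_a, p_b).mean() + kl_divergence(p_b, p_a).mean()
```

When the two experts agree exactly the loss is zero, and it grows as their per-pixel class distributions diverge; the positive-guided version in the paper additionally conditions this signal on prediction quality.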

Original language: English
Title of host publication: Proceedings - 2023 IEEE International Conference on Multimedia and Expo, ICME 2023
Publisher: IEEE Computer Society
Pages: 720-725
Number of pages: 6
ISBN (Electronic): 9781665468916
DOIs
State: Published - 2023
Event: 2023 IEEE International Conference on Multimedia and Expo, ICME 2023 - Brisbane, Australia
Duration: 10 Jul 2023 - 14 Jul 2023

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
Volume: 2023-July
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2023 IEEE International Conference on Multimedia and Expo, ICME 2023
Country/Territory: Australia
City: Brisbane
Period: 10/07/23 - 14/07/23

Keywords

  • Contrastive learning
  • Few-shot segmentation
  • Knowledge distillation
