TY - GEN
T1 - Dual-Expert Distillation Network for Few-Shot Segmentation
AU - Zhang, Junhang
AU - Zhuang, Zisong
AU - Xiao, Luwei
AU - Wu, Xingjiao
AU - Ma, Tianlong
AU - He, Liang
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - Few-shot segmentation has attracted growing interest owing to its value in practical applications. The primary challenge of few-shot segmentation lies in discovering semantic information, especially for query images. To tackle this issue, we propose a dual-expert distillation network (DEDN) comprising a scenario-level expert and an object-level expert that capture semantic information from different perspectives. In DEDN, the experts learn from each other through online knowledge distillation with a positive-guided Kullback-Leibler divergence. We introduce Scenario Normalization and Object Continuity Guidance for the dual experts to preserve their respective perspectives, and we further propose Adaptive Weighted Fusion to adapt the trained experts to novel classes and obtain reliable fused predictions. Extensive experiments on PASCAL-5i and COCO-20i show that our approach achieves state-of-the-art results.
KW - Contrastive learning
KW - Few-shot segmentation
KW - Knowledge distillation
UR - https://www.scopus.com/pages/publications/85171154020
U2 - 10.1109/ICME55011.2023.00129
DO - 10.1109/ICME55011.2023.00129
M3 - Conference contribution
AN - SCOPUS:85171154020
T3 - Proceedings - IEEE International Conference on Multimedia and Expo
SP - 720
EP - 725
BT - Proceedings - 2023 IEEE International Conference on Multimedia and Expo, ICME 2023
PB - IEEE Computer Society
T2 - 2023 IEEE International Conference on Multimedia and Expo, ICME 2023
Y2 - 10 July 2023 through 14 July 2023
ER -