TY - GEN
T1 - BOTH COMPARISON AND INDUCTION ARE INDISPENSABLE FOR CROSS-DOMAIN FEW-SHOT LEARNING
AU - Yuan, Wang
AU - Ma, Tian Xue
AU - Song, Haichuan
AU - Xie, Yuan
AU - Zhang, Zhizhong
AU - Ma, Lizhuang
N1 - Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - Few-shot learning (FSL), which aims to extract new knowledge from a very small number of labeled samples, has attracted noticeable attention recently. However, most existing methods often fail when facing a large domain shift between seen and unseen classes. We attribute this to the episode strategy, which neglects to utilize support samples to induce the test classes. In this paper, for the first time, we propose a bilevel episode strategy (BL-ES) to train an inductive graph network (IGN) that learns both comparison and induction. Specifically, outer episodes in BL-ES constantly simulate cross-domain few-shot tasks, while inner episodes drive IGN to induce the common features of test classes. Then, the proposed IGN captures the correlations among all samples to update the meta points of each category in the induction module. Finally, we introduce a geometrical constraint term based on meta points into the training loss to update the nodes and edges in feature space, which improves the robustness of the training process. Extensive experiments show that our framework outperforms state-of-the-art FSL alternatives and is more suitable for real-world applications.
AB - Few-shot learning (FSL), which aims to extract new knowledge from a very small number of labeled samples, has attracted noticeable attention recently. However, most existing methods often fail when facing a large domain shift between seen and unseen classes. We attribute this to the episode strategy, which neglects to utilize support samples to induce the test classes. In this paper, for the first time, we propose a bilevel episode strategy (BL-ES) to train an inductive graph network (IGN) that learns both comparison and induction. Specifically, outer episodes in BL-ES constantly simulate cross-domain few-shot tasks, while inner episodes drive IGN to induce the common features of test classes. Then, the proposed IGN captures the correlations among all samples to update the meta points of each category in the induction module. Finally, we introduce a geometrical constraint term based on meta points into the training loss to update the nodes and edges in feature space, which improves the robustness of the training process. Extensive experiments show that our framework outperforms state-of-the-art FSL alternatives and is more suitable for real-world applications.
KW - Cross domain
KW - Few-shot learning
KW - graph neural network
KW - meta learning
UR - https://www.scopus.com/pages/publications/85126428374
U2 - 10.1109/ICME51207.2021.9428141
DO - 10.1109/ICME51207.2021.9428141
M3 - Conference contribution
AN - SCOPUS:85126428374
T3 - Proceedings - IEEE International Conference on Multimedia and Expo
BT - 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
PB - IEEE Computer Society
T2 - 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Y2 - 5 July 2021 through 9 July 2021
ER -