TY - GEN
T1 - Self-supervised Contrastive Feature Refinement for Few-Shot Class-Incremental Learning
AU - Ma, Shengjin
AU - Yuan, Wang
AU - Wang, Yiting
AU - Tan, Xin
AU - Zhang, Zhizhong
AU - Ma, Lizhuang
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - Few-Shot Class-Incremental Learning (FSCIL) aims to learn novel classes incrementally from only a few data points without forgetting old classes. Capturing the underlying patterns and traits of the few-shot classes is very difficult. To meet these challenges, we propose a Self-supervised Contrastive Feature Refinement (SCFR) framework that tackles the FSCIL problem from three aspects. First, we employ a self-supervised learning framework that enables the network to learn richer representations and promotes feature refinement. Meanwhile, we design virtual classes to improve the model's robustness and generalization during training. To prevent catastrophic forgetting, we add Gaussian noise to previously encountered prototypes to recall the distribution of known classes and maintain stability in the embedding space. SCFR offers a systematic solution that effectively mitigates catastrophic forgetting and over-fitting. Experiments on widely recognized datasets, including CUB200, miniImageNet, and CIFAR100, show remarkable performance compared with other mainstream works.
AB - Few-Shot Class-Incremental Learning (FSCIL) aims to learn novel classes incrementally from only a few data points without forgetting old classes. Capturing the underlying patterns and traits of the few-shot classes is very difficult. To meet these challenges, we propose a Self-supervised Contrastive Feature Refinement (SCFR) framework that tackles the FSCIL problem from three aspects. First, we employ a self-supervised learning framework that enables the network to learn richer representations and promotes feature refinement. Meanwhile, we design virtual classes to improve the model's robustness and generalization during training. To prevent catastrophic forgetting, we add Gaussian noise to previously encountered prototypes to recall the distribution of known classes and maintain stability in the embedding space. SCFR offers a systematic solution that effectively mitigates catastrophic forgetting and over-fitting. Experiments on widely recognized datasets, including CUB200, miniImageNet, and CIFAR100, show remarkable performance compared with other mainstream works.
KW - Feature distribution recall
KW - Few-shot class-incremental learning
KW - Self-supervised learning
KW - Virtual class augmentation
UR - https://www.scopus.com/pages/publications/85185844516
U2 - 10.1007/978-981-99-9666-7_19
DO - 10.1007/978-981-99-9666-7_19
M3 - Conference contribution
AN - SCOPUS:85185844516
SN - 9789819996650
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 281
EP - 294
BT - Computer-Aided Design and Computer Graphics - 18th International Conference, CAD/Graphics 2023, Proceedings
A2 - Hu, Shi-Min
A2 - Cai, Yiyu
A2 - Rosin, Paul
PB - Springer Science and Business Media Deutschland GmbH
T2 - 18th International Conference on Computer-Aided Design and Computer Graphics, CAD/Graphics 2023
Y2 - 19 August 2023 through 21 August 2023
ER -