Bias to Balance: New-Knowledge-Preferred Few-Shot Class-Incremental Learning via Transition Calibration

Hongquan Zhang, Zhizhong Zhang, Xin Tan, Yanyun Qu, Yuan Xie

Research output: Contribution to journal › Article › peer-review

Abstract

Humans can quickly learn new concepts from limited experience without forgetting previously acquired knowledge. The machine-learning counterpart of this ability is known as few-shot class-incremental learning (FSCIL). Although some methods address this problem by devoting comparable effort to preventing forgetting and to promoting learning, we find that existing techniques do not give enough weight to the new categories, whose training samples are rare. In this article, we propose a new biased-to-unbiased rectification method that introduces a trainable transition matrix to mitigate the prediction discrepancy between the old classes and the new classes. This transition matrix is designed to be diagonally dominant, normalized, and differentiable, with a new-knowledge-preferred prior, resolving the strong bias between abundant old knowledge and limited new knowledge. By giving the new classes more opportunity, we achieve a balanced solution between learning new concepts and preventing catastrophic forgetting. Extensive experiments on miniImageNet, CIFAR100, and CUB200 demonstrate that our method outperforms the latest state-of-the-art methods by 1.1%, 1.44%, and 2.08%, respectively.
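To make the abstract's mechanism concrete, below is a minimal PyTorch sketch of a trainable transition matrix with the three stated properties (diagonal dominance, row normalization, differentiability) and a new-knowledge-preferred prior. The class name, the softmax-based parameterization, and the hyperparameters `diag_strength` and `new_class_prior` are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransitionCalibration(nn.Module):
    """Hedged sketch: a trainable stochastic matrix that redistributes
    predicted probability mass toward few-shot (new) classes.
    All names and the exact parameterization are assumptions."""

    def __init__(self, num_classes: int, num_old: int,
                 diag_strength: float = 5.0, new_class_prior: float = 1.0):
        super().__init__()
        # Free parameters of the transition matrix, learned end to end.
        self.weight = nn.Parameter(torch.zeros(num_classes, num_classes))
        # A scaled identity added before the row-wise softmax keeps the
        # matrix diagonally dominant, so calibration perturbs rather than
        # overwrites the backbone's original predictions.
        self.register_buffer("diag", diag_strength * torch.eye(num_classes))
        # A fixed additive bias on the columns of new classes encodes the
        # "new-knowledge-preferred" prior: mass is shifted more easily
        # toward the rare few-shot categories.
        prior = torch.zeros(num_classes)
        prior[num_old:] = new_class_prior
        self.register_buffer("prior", prior)

    def transition_matrix(self) -> torch.Tensor:
        # Row-wise softmax: every row is a normalized distribution, so the
        # matrix is stochastic and fully differentiable in self.weight.
        return F.softmax(self.weight + self.diag + self.prior, dim=1)

    def forward(self, probs: torch.Tensor) -> torch.Tensor:
        # Calibrate predicted class probabilities: p_cal = p @ T,
        # where probs has shape (batch, num_classes).
        return probs @ self.transition_matrix()
```

Under this reading, the matrix acts as a post-hoc rectifier on the classifier's output distribution: the dominant diagonal preserves confident old-class predictions, while the prior on new-class columns gives the under-sampled categories "more opportunity", as the abstract puts it.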

Original language: English
Pages (from-to): 15347-15358
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 36
Issue number: 8
DOIs
State: Published - 2025

Keywords

  • Classification bias
  • few-shot
  • incremental learning
  • transition calibration

