Incremental Learning Based on Dual-Branch Network

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Incremental learning aims to overcome catastrophic forgetting. When a model learns multiple tasks sequentially, the imbalance between the numbers of new and old classes causes the knowledge of old classes stored in the model to be destroyed by the large number of new classes. Existing single-backbone models have difficulty avoiding catastrophic forgetting. In this paper, we propose using a dual-branch network model to learn new tasks and thereby alleviate catastrophic forgetting. Unlike previous dual-branch models that learn tasks in parallel, our dual-branch network learns tasks serially: the model creates a new backbone for learning the remaining tasks and freezes the previous backbone. In this way, the model reduces damage to the previous backbone parameters that encode the old tasks. The model uses knowledge distillation to preserve information about old tasks while learning new ones, and we analyze different distillation methods for the dual-branch network model. We mainly focus on the more challenging setting of class-incremental learning, using a common incremental learning protocol on the ImageNet-100 dataset. The experimental results show that accuracy improves with the dual-branch network.
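The serial dual-branch scheme the abstract describes (freeze the old backbone, train a new one, and constrain the new branch with knowledge distillation) might be sketched as follows. This is a minimal pure-Python illustration with hypothetical names and a toy linear "backbone"; the paper's actual architecture and distillation variant are not specified here.

```python
import math


def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the frozen branch's softened predictions
    (teacher) and the trainable branch's predictions (student)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))


class DualBranchClassifier:
    """Hypothetical sketch: the first branch is frozen after its tasks,
    and a second branch is created to learn the remaining tasks serially."""

    def __init__(self, old_weights):
        self.old_weights = [list(w) for w in old_weights]  # frozen backbone
        self.new_weights = [list(w) for w in old_weights]  # trainable copy

    @staticmethod
    def logits(weights, x):
        # Toy linear "backbone": one logit per class weight vector.
        return [sum(wi * xi for wi, xi in zip(w, x)) for w in weights]

    def train_step(self, x):
        # Only the new branch is updated; the frozen old branch's outputs
        # supervise the new branch via the distillation term.
        teacher = self.logits(self.old_weights, x)
        student = self.logits(self.new_weights, x)
        loss = distillation_loss(teacher, student)
        # (Gradient update on self.new_weights elided; a real model would
        # combine this loss with the new-task classification loss and
        # backpropagate through the new branch only.)
        return loss
```

In a real implementation the distillation term would be added to the new-task cross-entropy loss, and the frozen branch would simply have gradient updates disabled; freezing is what prevents the flood of new-class samples from overwriting the old backbone's parameters.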

Original language: English
Title of host publication: Pattern Recognition and Computer Vision - 6th Chinese Conference, PRCV 2023, Proceedings
Editors: Qingshan Liu, Hanzi Wang, Rongrong Ji, Zhanyu Ma, Weishi Zheng, Hongbin Zha, Xilin Chen, Liang Wang
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 263-272
Number of pages: 10
ISBN (Print): 9789819984343
DOIs
State: Published - 2024
Event: 6th Chinese Conference on Pattern Recognition and Computer Vision, PRCV 2023 - Xiamen, China
Duration: 13 Oct 2023 – 15 Oct 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14427 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 6th Chinese Conference on Pattern Recognition and Computer Vision, PRCV 2023
Country/Territory: China
City: Xiamen
Period: 13/10/23 – 15/10/23

Keywords

  • catastrophic forgetting
  • incremental learning
  • knowledge distillation

