Rehearsal-free Continual Language Learning via Efficient Parameter Isolation

Zhicheng Wang, Yufang Liu, Tao Ji, Xiaoling Wang, Yuanbin Wu, Congcong Jiang, Ye Chao, Zhencong Han, Ling Wang, Xu Shao, Wenqiu Zeng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

40 Scopus citations

Abstract

We study the problem of mitigating catastrophic forgetting when learning a series of language processing tasks. Compared with previous methods, we emphasize the importance of not caching data from past tasks, which makes the problem more challenging. Our proposed method applies a parameter isolation strategy: for each task, it allocates a small portion of private parameters and learns them together with a shared pre-trained model. To load the correct parameters at test time, we introduce a simple yet effective non-parametric method. Experiments on continual language learning benchmarks show that our method significantly outperforms all existing no-data-cache methods and is comparable to (or even better than) those that use historical data.
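The abstract describes the mechanism only at a high level, but its two ingredients (per-task private parameters on top of a frozen shared pre-trained model, and non-parametric selection of those parameters at test time) can be sketched roughly as below. This is an illustrative PyTorch sketch under assumptions, not the authors' implementation: the class IsolatedTaskLearner, the use of linear heads as the private parameters, and the nearest-prototype rule for task identification are all names and choices made here for exposition.

    import torch
    import torch.nn as nn
    from typing import Optional

    class IsolatedTaskLearner(nn.Module):
        # Sketch of parameter isolation (hypothetical, not the paper's code):
        # a frozen shared encoder plus one small private head per task.
        # At test time the task is recovered non-parametrically by matching
        # encoder features to the nearest stored task prototype (one mean
        # vector per task), so no historical training data is cached.

        def __init__(self, shared_encoder: nn.Module):
            super().__init__()
            self.encoder = shared_encoder
            for p in self.encoder.parameters():   # keep shared weights frozen
                p.requires_grad = False
            self.heads = nn.ModuleDict()          # task_id -> private parameters
            self.prototypes = {}                  # task_id -> mean feature vector

        def add_task(self, task_id: str, feat_dim: int, num_classes: int) -> None:
            # Allocate the small portion of private parameters for a new task.
            self.heads[task_id] = nn.Linear(feat_dim, num_classes)

        @torch.no_grad()
        def register_prototype(self, task_id: str, features: torch.Tensor) -> None:
            # Store a single mean feature vector per task after training on it.
            self.prototypes[task_id] = features.mean(dim=0)

        @torch.no_grad()
        def identify_task(self, feature: torch.Tensor) -> str:
            # Non-parametric selection: nearest task prototype in feature space.
            return min(self.prototypes,
                       key=lambda t: torch.norm(feature - self.prototypes[t]).item())

        def forward(self, x: torch.Tensor, task_id: Optional[str] = None) -> torch.Tensor:
            feat = self.encoder(x)                # shared representation
            if task_id is None:                   # inference: one decision per batch
                task_id = self.identify_task(feat.mean(dim=0))
            return self.heads[task_id](feat)

    # Toy usage with a stand-in encoder; in practice the encoder would be a
    # pre-trained language model.
    encoder = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
    model = IsolatedTaskLearner(encoder)
    model.add_task("task_a", feat_dim=16, num_classes=3)
    x = torch.randn(4, 8)
    model.register_prototype("task_a", model.encoder(x))
    logits = model(x)                             # task head chosen automatically

Note that only one prototype vector per task is retained here, which is what keeps the sketch rehearsal-free in spirit; how the paper actually performs test-time parameter selection may differ.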

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 10933-10946
Number of pages: 14
ISBN (Electronic): 9781959429722
DOIs
State: Published - 2023
Event: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada
Duration: 9 Jul 2023 - 14 Jul 2023

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Volume: 1
ISSN (Print): 0736-587X

Conference

Conference: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Country/Territory: Canada
City: Toronto
Period: 9/07/23 - 14/07/23
