TY - JOUR
T1 - Sparse personalized federated class-incremental learning
AU - Liu, Youchao
AU - Huang, Dingjiang
N1 - Publisher Copyright:
© 2025 Elsevier Inc.
PY - 2025/7
Y1 - 2025/7
N2 - Recently, federated learning (FL) has attracted growing attention for enabling data-private collaborative training on decentralized clients. However, most existing FL methods concentrate on single-task scenarios with static data. In real-world settings, local clients typically collect new classes continuously from a data stream and have only a small amount of memory to store training samples of old classes. Directly using single-task models therefore leads to significant catastrophic forgetting of old classes. In addition, FL scenarios pose typical challenges such as computation and communication overhead and data heterogeneity. To capture these challenges comprehensively, we formulate a new Personalized Federated Class-Incremental Learning (PFCIL) problem. We further propose an innovative Sparse Personalized Federated Class-Incremental Learning (SpaPFCIL) framework that solves this problem by learning a personalized class-incremental model for each client through sparse training. Unlike most knowledge distillation-based methods, our framework requires no auxiliary data. Specifically, to tackle the catastrophic forgetting caused by class-incremental tasks, we employ expandable class-incremental models instead of single-task models. To address the typical FL challenges, we use dynamic sparse training to customize sparse local models on clients, which alleviates the negative effects of data heterogeneity and over-parameterization. Our framework outperforms state-of-the-art methods in average accuracy on representative benchmark datasets by 3.3% to 43.6%.
AB - Recently, federated learning (FL) has attracted growing attention for enabling data-private collaborative training on decentralized clients. However, most existing FL methods concentrate on single-task scenarios with static data. In real-world settings, local clients typically collect new classes continuously from a data stream and have only a small amount of memory to store training samples of old classes. Directly using single-task models therefore leads to significant catastrophic forgetting of old classes. In addition, FL scenarios pose typical challenges such as computation and communication overhead and data heterogeneity. To capture these challenges comprehensively, we formulate a new Personalized Federated Class-Incremental Learning (PFCIL) problem. We further propose an innovative Sparse Personalized Federated Class-Incremental Learning (SpaPFCIL) framework that solves this problem by learning a personalized class-incremental model for each client through sparse training. Unlike most knowledge distillation-based methods, our framework requires no auxiliary data. Specifically, to tackle the catastrophic forgetting caused by class-incremental tasks, we employ expandable class-incremental models instead of single-task models. To address the typical FL challenges, we use dynamic sparse training to customize sparse local models on clients, which alleviates the negative effects of data heterogeneity and over-parameterization. Our framework outperforms state-of-the-art methods in average accuracy on representative benchmark datasets by 3.3% to 43.6%.
KW - Class-incremental learning
KW - Federated learning
KW - Sparse training
UR - https://www.scopus.com/pages/publications/85218105557
U2 - 10.1016/j.ins.2025.121992
DO - 10.1016/j.ins.2025.121992
M3 - Article
AN - SCOPUS:85218105557
SN - 0020-0255
VL - 706
JO - Information Sciences
JF - Information Sciences
M1 - 121992
ER -