TY - GEN
T1 - Learning to Generalize in Heterogeneous Federated Networks
AU - Chen, Cen
AU - Ye, Tiandi
AU - Wang, Li
AU - Gao, Ming
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/10/17
Y1 - 2022/10/17
N2 - With the rapid development of the Internet of Things (IoT), the need to expand the amount of data through data-sharing to improve the model performance of edge devices has become increasingly compelling. To effectively protect data privacy while leveraging data across silos, federated learning has emerged. However, in real-world applications, federated learning inevitably faces both data and model heterogeneity challenges. To address the heterogeneity issues in federated networks, in this work, we seek to jointly learn a global feature representation that is robust across clients and potentially also generalizable to new clients. More specifically, we propose a personalized Federated optimization framework with Meta Critic (FedMC) that efficiently captures robust and generalizable domain-invariant knowledge across clients. Extensive experiments on four public datasets show that the proposed FedMC outperforms the competing state-of-the-art methods in heterogeneous federated learning settings. We have also performed detailed ablation analysis on the importance of different components of the proposed model.
AB - With the rapid development of the Internet of Things (IoT), the need to expand the amount of data through data-sharing to improve the model performance of edge devices has become increasingly compelling. To effectively protect data privacy while leveraging data across silos, federated learning has emerged. However, in real-world applications, federated learning inevitably faces both data and model heterogeneity challenges. To address the heterogeneity issues in federated networks, in this work, we seek to jointly learn a global feature representation that is robust across clients and potentially also generalizable to new clients. More specifically, we propose a personalized Federated optimization framework with Meta Critic (FedMC) that efficiently captures robust and generalizable domain-invariant knowledge across clients. Extensive experiments on four public datasets show that the proposed FedMC outperforms the competing state-of-the-art methods in heterogeneous federated learning settings. We have also performed detailed ablation analysis on the importance of different components of the proposed model.
KW - heterogeneous federated learning
KW - meta optimization
KW - Wasserstein critic
UR - https://www.scopus.com/pages/publications/85140838733
U2 - 10.1145/3511808.3557378
DO - 10.1145/3511808.3557378
M3 - Conference contribution
AN - SCOPUS:85140838733
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 159
EP - 168
BT - CIKM 2022 - Proceedings of the 31st ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
T2 - 31st ACM International Conference on Information and Knowledge Management, CIKM 2022
Y2 - 17 October 2022 through 21 October 2022
ER -