TY - GEN
T1 - FEDERATED LEARNING VIA CONSENSUS MECHANISM ON HETEROGENEOUS DATA
T2 - 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
AU - Zheng, Shu
AU - Ye, Tiandi
AU - Li, Xiang
AU - Gao, Ming
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Federated learning (FL) on heterogeneous (non-IID) data has recently received considerable attention. Most existing methods focus on convergence guarantees for the global objective. While these methods can guarantee a decrease of the global objective in each communication round, they fail to ensure a risk decrease for each client. In this paper, we propose FedCOME, which introduces a consensus mechanism aimed at decreasing the risk for each client after each training round. In particular, we allow a slight server-side adjustment to each client's gradient, producing an acute angle between the corrected and original gradients of the participating clients. To generalize the consensus mechanism to the partial-participation FL scenario, we devise a novel client sampling strategy that enhances the representativeness of the selected client subset so that it more accurately reflects the global population. Training on these selected clients with the consensus mechanism empirically leads to a risk decrease for clients that are not selected. Finally, we conduct extensive experiments on four benchmark datasets to show the superiority of FedCOME over other state-of-the-art methods in terms of effectiveness and efficiency. For reproducibility, we make our source code publicly available at: https://github.com/fedcome/fedcome.
AB - Federated learning (FL) on heterogeneous (non-IID) data has recently received considerable attention. Most existing methods focus on convergence guarantees for the global objective. While these methods can guarantee a decrease of the global objective in each communication round, they fail to ensure a risk decrease for each client. In this paper, we propose FedCOME, which introduces a consensus mechanism aimed at decreasing the risk for each client after each training round. In particular, we allow a slight server-side adjustment to each client's gradient, producing an acute angle between the corrected and original gradients of the participating clients. To generalize the consensus mechanism to the partial-participation FL scenario, we devise a novel client sampling strategy that enhances the representativeness of the selected client subset so that it more accurately reflects the global population. Training on these selected clients with the consensus mechanism empirically leads to a risk decrease for clients that are not selected. Finally, we conduct extensive experiments on four benchmark datasets to show the superiority of FedCOME over other state-of-the-art methods in terms of effectiveness and efficiency. For reproducibility, we make our source code publicly available at: https://github.com/fedcome/fedcome.
KW - Federated learning
KW - consensus mechanism
UR - https://www.scopus.com/pages/publications/85192956573
U2 - 10.1109/ICASSP48485.2024.10446892
DO - 10.1109/ICASSP48485.2024.10446892
M3 - Conference contribution
AN - SCOPUS:85192956573
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 7595
EP - 7599
BT - 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 14 April 2024 through 19 April 2024
ER -