TY - GEN
T1 - FedEntropy
T2 - 21st IEEE International Symposium on Parallel and Distributed Processing with Applications, 13th IEEE International Conference on Big Data and Cloud Computing, 16th IEEE International Conference on Social Computing and Networking and 13th International Conference on Sustainable Computing and Communications, ISPA/BDCloud/SocialCom/SustainCom 2023
AU - Ling, Zhiwei
AU - Yue, Zhihao
AU - Xia, Jun
AU - Wang, Ting
AU - Chen, Mingsong
AU - Lian, Xiang
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Although various techniques have been proposed for Federated Learning (FL) to address its problem of low classification accuracy in non-IID scenarios, most of them neglect both i) distinct data distribution characteristics of heterogeneous devices (i.e., clients), and ii) benefits and hazards of local models for global model aggregation. In this paper, we present FedEntropy, an efficient FL approach with a novel two-stage dynamic client selection scheme that fully takes the above two factors into account. Unlike existing FL methods, FedEntropy firstly selects clients with high potential for benefiting global model aggregation in a coarse manner, and then further filters out inferior clients from such selected clients by using our proposed maximum entropy judgment method. Based on the pre-collected soft labels of the selected clients, FedEntropy only aggregates those local models that can maximize the overall entropy of their soft labels, thus effectively improving global model accuracy while reducing the overall communication overhead. Comprehensive experimental results on well-known benchmarks demonstrate both the superiority of FedEntropy and its compatibility with state-of-the-art FL methods.
AB - Although various techniques have been proposed for Federated Learning (FL) to address its problem of low classification accuracy in non-IID scenarios, most of them neglect both i) distinct data distribution characteristics of heterogeneous devices (i.e., clients), and ii) benefits and hazards of local models for global model aggregation. In this paper, we present FedEntropy, an efficient FL approach with a novel two-stage dynamic client selection scheme that fully takes the above two factors into account. Unlike existing FL methods, FedEntropy firstly selects clients with high potential for benefiting global model aggregation in a coarse manner, and then further filters out inferior clients from such selected clients by using our proposed maximum entropy judgment method. Based on the pre-collected soft labels of the selected clients, FedEntropy only aggregates those local models that can maximize the overall entropy of their soft labels, thus effectively improving global model accuracy while reducing the overall communication overhead. Comprehensive experimental results on well-known benchmarks demonstrate both the superiority of FedEntropy and its compatibility with state-of-the-art FL methods.
KW - Federated Learning
KW - client selection
KW - maximum entropy judgment
KW - non-IID data distribution
UR - https://www.scopus.com/pages/publications/85191341284
U2 - 10.1109/ISPA-BDCloud-SocialCom-SustainCom59178.2023.00040
DO - 10.1109/ISPA-BDCloud-SocialCom-SustainCom59178.2023.00040
M3 - Conference contribution
AN - SCOPUS:85191341284
T3 - Proceedings - 2023 IEEE International Conference on Parallel and Distributed Processing with Applications, Big Data and Cloud Computing, Sustainable Computing and Communications, Social Computing and Networking, ISPA/BDCloud/SocialCom/SustainCom 2023
SP - 56
EP - 63
BT - Proceedings - 2023 IEEE International Conference on Parallel and Distributed Processing with Applications, Big Data and Cloud Computing, Sustainable Computing and Communications, Social Computing and Networking, ISPA/BDCloud/SocialCom/SustainCom 2023
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 December 2023 through 24 December 2023
ER -