TY - GEN
T1 - FedCo
T2 - 24th IEEE International Conference on High Performance Computing and Communications, 8th IEEE International Conference on Data Science and Systems, 20th IEEE International Conference on Smart City and 8th IEEE International Conference on Dependability in Sensor, Cloud and Big Data Systems and Application, HPCC/DSS/SmartCity/DependSys 2022
AU - Wei, Shuai
AU - Cao, Guitao
AU - Dai, Cheng
AU - Dai, Shengxin
AU - Guo, Bing
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Federated learning (FL) enables multiple participants to build a common, robust machine learning model without sharing data, making it a key technique for addressing data privacy, security, access rights, and access to heterogeneous data. However, existing FL algorithms still fall short of expectations in unsupervised learning and on non-IID data, two major challenges that limit their applicability and accuracy. For example, the lack of data labels or skewed feature distributions leads to a loss of model representativeness; at the same time, labeling client data incurs substantial costs in real application scenarios, so supervised federated learning greatly limits the applicability of FL. In this paper, we propose a new unsupervised FL algorithm, FedCo. In particular, in the parameter aggregation phase, we apply a momentum update queue to improve training performance, enhance model accuracy, and lower labeling costs. In the local client training phase, a common queue, which can be regarded as a large dictionary, addresses data heterogeneity. Experiments on a benchmark data set show that our method is comparable to supervised and semi-supervised federated learning models, which demonstrates the effectiveness of FedCo.
AB - Federated learning (FL) enables multiple participants to build a common, robust machine learning model without sharing data, making it a key technique for addressing data privacy, security, access rights, and access to heterogeneous data. However, existing FL algorithms still fall short of expectations in unsupervised learning and on non-IID data, two major challenges that limit their applicability and accuracy. For example, the lack of data labels or skewed feature distributions leads to a loss of model representativeness; at the same time, labeling client data incurs substantial costs in real application scenarios, so supervised federated learning greatly limits the applicability of FL. In this paper, we propose a new unsupervised FL algorithm, FedCo. In particular, in the parameter aggregation phase, we apply a momentum update queue to improve training performance, enhance model accuracy, and lower labeling costs. In the local client training phase, a common queue, which can be regarded as a large dictionary, addresses data heterogeneity. Experiments on a benchmark data set show that our method is comparable to supervised and semi-supervised federated learning models, which demonstrates the effectiveness of FedCo.
KW - Federated Learning
KW - Non-IID
KW - unsupervised learning
UR - https://www.scopus.com/pages/publications/85152223582
U2 - 10.1109/HPCC-DSS-SmartCity-DependSys57074.2022.00192
DO - 10.1109/HPCC-DSS-SmartCity-DependSys57074.2022.00192
M3 - Conference contribution
AN - SCOPUS:85152223582
T3 - Proceedings - 24th IEEE International Conference on High Performance Computing and Communications, 8th IEEE International Conference on Data Science and Systems, 20th IEEE International Conference on Smart City and 8th IEEE International Conference on Dependability in Sensor, Cloud and Big Data Systems and Application, HPCC/DSS/SmartCity/DependSys 2022
SP - 1222
EP - 1227
BT - Proceedings - 24th IEEE International Conference on High Performance Computing and Communications, 8th IEEE International Conference on Data Science and Systems, 20th IEEE International Conference on Smart City and 8th IEEE International Conference on Dependability in Sensor, Cloud and Big Data Systems and Application, HPCC/DSS/SmartCity/DependSys 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 December 2022 through 20 December 2022
ER -